Journal of Technical Analysis
Issue 62
Summer-Fall 2004
Incorporated 1973

Table of Contents
This issue of the Journal of Technical Analysis includes the most recent Charles H. Dow Award winner, Jason Goepfert.
His study is an update on work that has been done before by others but is extremely detailed and comprehensive, and adds
to our knowledge of how mutual fund cash positions reflect investment management opinion. Our old friend and prior editor of the Journal, Henry Pruden, together with two French professors, Bernard Paranque and Walter Baets, continues the discussion of a possible explanation for the connection between behavioral finance and stock market behavior. As always,
they introduce many new ideas to think about. And finally, we have an article by Buff Dormeier in which he devises a new
way to look at volume and price action together that appears to have some predictive ability.
Charles D. Kirkpatrick II, CMT, Editor
Jason Goepfert
Associate Editor
Michael Carr, CMT
Cheyenne, Wyoming
Manuscript Reviewers
Connie Brown, CMT
Aerodynamic Investments Inc.
Pawleys Island, South Carolina
Production Coordinator
Publisher
Barbara I. Gomperts
Manager, Marketing Services, MTA
Marblehead, Massachusetts
JOURNAL of Technical Analysis is published by the Market Technicians Association, Inc., (MTA) 74 Main Street, 3rd Floor, Woodbridge, NJ 07095. Its purpose
is to promote the investigation and analysis of the price and volume activities of the world's financial markets. JOURNAL of Technical Analysis is distributed to
individuals (both academic and practitioner) and libraries in the United States, Canada and several other countries in Europe and Asia. JOURNAL of Technical
Analysis is copyrighted by the Market Technicians Association and registered with the Library of Congress. All rights are reserved.
MTA MEMBER
Member category is available to those whose professional efforts are spent
practicing financial technical analysis that is either made available to the investing public or becomes a primary input into an active portfolio management
process or for whom technical analysis is a primary basis of their investment
decision-making process. Applicants for Membership must be engaged in the
above capacity for five years and must be sponsored by three MTA Members
familiar with the applicant's work.
MTA AFFILIATE
MTA Affiliate status is available to individuals who are interested in technical analysis and the benefits of the MTA listed below. Most importantly,
Affiliates are included in the vast network of MTA Members and Affiliates
across the nation and the world, providing common ground among
fellow technicians.
DUES
Dues for Members and Affiliates are $300 per year and are payable when
joining the MTA and annually on July 1st. College students may join at a
reduced rate of $50 with the endorsement of a professor. Applicants for Member status will be charged a one-time application fee of $25.
essence of agents' behaviour, we make the choice that reality is created via the interaction of individual agents that create emergent behaviour. In the words of the famous Spanish poet Machado: "there is no path; you lay down the path in walking."
What do we understand by enacted and embodied cognition within an autopoietic system? An autopoietic system is a concept from neurobiology that describes the behaviour of any neurobiological colony, hence including human behaviour. An autopoietic system is one that organises and reproduces itself in a way that is ideal for survival. The human body is an excellent example: cells in the body continuously reproduce to allow the body to survive, and the body is completely self-organised. Within such a system we can identify a mind (say, an individual's mind) that is embodied, which means that it is not just embrained (the computer metaphor) but literally distributed through the body via the sensors (the human senses) in continuous contact with its environment.
The environment co-creates the mind. Cognition that eventually leads to behaviour is then enacted. Enaction has two dimensions: action and shaping. Therefore cognitive action always contains these two components: action and creation. All the rest is information. Here we hit a common confusion between knowledge and information. Information is static (and linear) and therefore can be copied and repeated, whereas knowledge is dynamic (and non-linear) and therefore needs to be created each time over and again. Complexity theory (Nicolis and Prigogine) has proven that perspective to us over the last 30 years. The enacted view on knowledge (and behaviour) allows us to explore models that have creative force and show emergent behaviour.
An often-made assumption, which we presume is too limited, is that rational (human) behaviour can only be causal (based on the hidden ontological assumption described above). If it is causal, one can write it down in equations that in turn would drive reality. If we really believe in behavioural theories, then let us take them to their logical conclusion: agent theory.
For clarity's sake: we have already touched upon a few concepts of complexity theory (dynamic non-linear systems behaviour) that shed a completely different light on market behaviour (Baets, 1998a and b). Systems are self-organising, based on an embodied mind and on enacted cognition. Systems and knowledge are re-created over and again (which is, by the way, what our brain does, since it is the most efficient way of organisation). Reality is not Newtonian (a fixed time-space concept) but emergent (co-created in interaction). In my habilitation thesis I have called that "the quantum structure of business" (Baets, 2004). Complexity theory goes much further, but for the purpose of our argument we can leave it here.
An interesting development based on this complexity theory is what we know as artificial-life research (Langton) and one of its further developments, agent-based simulation (Holland). Agent-based simulation is a development in artificial intelligence that, unlike what AI is unfortunately still best known for (expert systems), exhibits learning behaviour. Indeed, agent simulations are based on the interaction of individual agents that have individual qualities and purposes and that agree upon a minimum set of interaction rules. Behaviour is clearly dynamic, produced in the continuous interaction of agents that exchange information with each other. The least one can say is that this is very much like human behaviour, particularly in financial markets. Whereas catastrophe theory implies a time dimension, agent-based simulation gives due importance to what Prigogine calls the constructive role of time. Each time we bring in the arrow of time, let us say the constructive role of interaction, behaviour gets created; it literally emerges.
This view supposes a number of interacting agents within a specific field (of action), each having their own qualities and goals and following a minimum set of interaction and exchange rules. The question then becomes how such a complex system could come to a coherent state. Most suggestions go in the same direction. Varela suggests resonance as the mechanism; Sheldrake suggests morphogenetic fields: sense is made out of interaction in a non-causal way. This mechanism of resonance is what occurs in SWARM-like societies (Epstein and Axtell, 1996). In fact we are talking here of agent theories. In agent theory, as already suggested, we only have to identify the playing ground (let us say a particular financial market) and a number of agents. Each agent is autonomous in achieving his goal(s) and is of course gifted with qualities (like experience, information, human characteristics). Those agents interact with each other based on a minimum number of interaction rules. Those rules govern the behaviour in the simulation, but they also define the learning of the different agents. Agents, translating learning into (new) action, continuously co-create new (and adapted) behaviour in interaction with each other. Indeed, in such a market the path is laid down in walking, just as reality happens to be in financial markets.
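The setup just described, a playing ground plus autonomous agents following a minimum set of interaction rules, can be sketched in a few lines of Python. This is an illustrative toy only, not a model from the literature cited here; the `optimism` quality, the learning rule and all parameter values are our own assumptions, chosen merely to show how aggregate price behaviour emerges from individual interaction.

```python
import random

class Agent:
    """One autonomous agent with an individual quality (its disposition)."""
    def __init__(self, rng):
        self.optimism = rng.uniform(-1.0, 1.0)  # individual starting quality

    def decide(self, last_change, rng):
        # Interaction rule: blend the agent's own disposition with what it
        # learned from the last market move (learning enacted in interaction).
        self.optimism = 0.9 * self.optimism + 0.1 * last_change
        # Each agent submits a buy (+1) or sell (-1) order.
        return 1 if self.optimism + rng.gauss(0, 0.2) > 0 else -1

def simulate(n_agents=100, steps=50, seed=42):
    """Let the agents interact; the price path emerges from their orders."""
    rng = random.Random(seed)
    agents = [Agent(rng) for _ in range(n_agents)]
    price, history, last_change = 100.0, [], 0.0
    for _ in range(steps):
        # Net order flow drives the price: behaviour is produced in the
        # continuous interaction, not imposed by an external equation.
        flow = sum(a.decide(last_change, rng) for a in agents)
        last_change = flow / n_agents
        price *= 1 + 0.01 * last_change
        history.append(price)
    return history

prices = simulate()
```

Running the simulation twice with the same seed reproduces the same path, yet changing a single agent's disposition changes the whole trajectory: the path is, quite literally, laid down in walking.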
The argument takes Catastrophe theory one step further to its intrinsic ultimate claim, i.e. that time plays a constructive role in (market) behaviour. In our
own research (Baets, 2004 and 2005) a number of projects have been undertaken using agent theories, though not yet in the financial markets. Agent theories have been successfully used to visualise the emergence of innovation in a large consumer goods company, to visualise emergent market behaviour in order to identify an adapted market-introduction strategy, and to study emergent states in conflict handling.
The basic question does lead us back to the ontological choices we discussed earlier. Once we accept complexity theory as a promising paradigm, we cannot avoid the question of causality. Quantum mechanics has given the world a tremendous dilemma: how is it possible that two photons moving in different directions still keep in instantaneous contact? As Pauli (Van Meijgaard), amongst others, suggests, there should indeed be interaction in a non-local field. Things seem to occur at the same time without having any causal relationship. It is this quantum structure of (financial) markets that deserves our attention in order to improve our understanding of market behaviour (Baets, 2004).
first is through the laws and other professional rules, such as those of the SEC, Basle and so on; the second is the availability of tools allowing actors to avoid problems such as the one in the article quoted above; the last is at the level of the individual and his/her own capacity to take into account the collective interest, or so-called social welfare. Neoclassical economic theory says that, under specific hypotheses, the market, and particularly the financial market, is the best way to ensure the right allocation of resources. But, since those hypotheses are never verified, we need other tools to manage the market. We won't speak about laws and the regulatory rules that Basle 2 is preparing to impose. Rather, we will focus on individual behavior.
A lot of criticism points out the myopia of the agents and the mimetism of their decisions, even though these attitudes are the cause of the breakdown in the distribution of social welfare. When one has lost confidence in the others or, more relevantly, decides to change course because he is able to influence the market (in fact it is the only way to win, because the winner needs a lot of losers, that is, a lot of followers), it could be possible to anticipate the breakdown if we are able to identify the main proxies of these strategies, as was demonstrated in our article (Part A). I feel that our individual social responsibility is of true significance. This responsibility can't be assumed without clear rules of action. I would like to take an example from the work of Jensen.
In an article published in October 2001, Jensen highlighted the operational
limitations of the prevailing interpretation/use made of value maximization and
the stakeholder theory. He then engaged in a criticism of the central model of entrepreneurship, with its polar figures of the manager and the shareholder. In addition, he wanted to introduce the other stakeholders of the firm.

Without it being explicitly stated, it seems that the different financial scandals may have a bearing on the desire to explain the operating conditions proposed by the maximization versus stakeholder theories, which are in some ways competing and in other ways complementary.
On the one hand, it is argued that value maximization for the shareholder,
with all the problems this type of monitoring entails, remains the best way to
attain social welfare in a market economy. On the other hand, stakeholder theory
stresses the need to take into account the interests of all of the stakeholders in a
firm, including the customers, all of the suppliers, and the employees. According
to Jensen, the complementarity of the two theories stems from the need to understand value maximization from a collective point of view: social welfare is only achieved when all of the value contributed by each of the stakeholders is
maximized, and when this maximization of value occurs over the long term. The
result is that the firm is recognized as a historical and complex organization.
However, an operational problem arises if managers are expected to maximize value thus defined, in that there is no reason why the objectives of the various stakeholders should coincide. This criticism is valid both from the
point of view of value maximization (how can several objectives be managed
simultaneously?) and that of stakeholder theory (how is a common objective to
be defined?).
In fact, while Jensen recognizes the relevance of the stakeholder theory, he underlines a problem: this theory is not able to answer the question of how to manage several aims which could diverge. He says that before managing the firm, maximizing its value and taking into account the wishes of the stakeholders, there is the need to obtain an agreement, on the one hand about the hierarchy of the aims, and on the other hand about the modalities of their accomplishment and the monitoring of the performance of the firm.2
The agreement is the core of the deal and of the future performance, because it will determine the manager's value-maximization strategy, in particular in the field of the organization of the firm. The supporters of the stakeholder theory do have a tool, the balanced scorecard, but, as Jensen notes, they say nothing about the necessity of obtaining beforehand an agreement on the objectives from every participant involved in the firm and then, along the way, building common rules to play by.
This concern means that social welfare implies dealing with "problems of information, anticipation and evaluation" (Salais et alii, 1986, p. 193). In fact, at a collective level but also at an individual level, we need to agree on a common reality, not only to build it but also to agree to act together in this perspective: "What is at stake in these negotiations is the model of interpretation to be retained in order to construct the reality that presents itself to them [the agents] as a problem to be solved" (id., pp. 197-198). In other words, this necessary negotiation expresses a convention through which "the agents' agreement on their description of the world, thus [allowing] them to coordinate their projects" (id., p. 236), is approved. That kind of agreement "rests on social processes of elaboration of models for representing reality" (id., p. 239).
Then the question is how to manage this agreement at a collective level and at an individual level. We need to identify specific coordination principles on which we can obtain agreement from the stakeholders, and the availability of specific tools giving the opportunity to manage collective behavior by anticipating the risk of breakdown, that is, the behavior of the one who does not play with the same aim. But it is not possible to negotiate this kind of agreement without discussing the relevance of the criteria of management and the sense of performance, and hence their different meanings for the stakeholders. For example, from the workers' point of view, the starting point must be the value added and not the EBITDA or the cash flow, because the value added is the condition of their wages, despite the fact that the wages have an influence on the profit.3
In total, "to undertake with efficiency presupposes mastering the uncertainty relating to markets, technologies and future products, and the coherence of one's own projects with respect to those of the other agents, partners or competitors" (id., p. 246).
Nevertheless, the main point is the coordination of the agents' behaviour, which deals with the management of uncertainty.
"In a context of relations with others from which one cannot abstract, the uncertainty attached to the person must be understood as a communicational uncertainty. However, this designation is itself ambiguous, because it could suggest that uncertainty comes down to a problem of circulation of information, to an imperfection. Yet a piece of information can only circulate if it has first been elaborated in a common language and if, consequently, it can be adjusted on both sides within a device that is congruent with it (for example, the presence of identical codes)" (Salais and Storper, 1993, pp. 76-78).
(In English: the wage-regulation system ("la forme salaire" in French) keeps the workforce unaware of the work that has been accomplished (page 255, Salais et alii), to the extent that the accomplished work is revealed through the value produced once the intermediary consumptions have been paid, namely the added value (see page 227, as well as the written work of Paul Boccara on the subject, 1985).)
In the original article (Part A), expressions of fear were observed to grow and expressions of greed to shrink as the market price behavior neared the breakdown, the catastrophe jump point. One principle of Sequential Art is the interdependence between the sounds (musical notes) and the visual indications shown on the charts. The sound (words, musical notes) and the marks on the chart picture go hand in hand to convey an idea of changing market sentiment that neither could convey alone. Here one can see how sounds, words, musical notes and pictorial indicators support each other's strengths. This gives rise to the suggestion that technical analysis ought to include the careful annotation of junctures where it can be seen that behavior is changing on a chart. Please notice how poignant is the use of icons, in this case musical symbols for high and low offers and bids, that convey the changing juxtaposition of fear vs. greed. The combination of sound and visual clues also suggests a superior means of conveying the distinct characteristics of a sentiment indicator; they are a fine way of communicating the emotional content of the information.
2. TRADING RANGE CHANNELS ALONG TOPS AND BOTTOMS
The data shown in the trading range provide a good opportunity to cover the essentials of Sequential Art. First there is the idea or purpose, which in the case of trend channels is to outline the expected future course of price behavior... the idea is to define and extrapolate. The form employed was that of displaying price behavior over time in graphic panels, which is to say, to create a chart. A different form could have been the depiction of the trend through averaging and simplifying the data into a moving average. Thus the technical analyst, as a practical artist, has choices to make in the selection of form.
Another artful choice is the structure of the sequencing of chart data over time. A trend channel only makes sense if it has a beginning and an ending. In the case of the Cal Tech Experiment there were three structural sections. First was the trend of prices that reflected the progress, the growth, of speculation. Then there was a section labelled the dissipative gradient; that panel could have been made even more distinctive through a change in colors, say from green to yellow. The third section that could have been framed was the panic/crash in prices, and this third panel could have been further separated with the addition of the color red.
The artist/technician would thus set up a sequence of meaningful parts or patterns that taken together would tell the story of boom and bust upon the surface of the market. At a deeper level of analysis, the separation into three distinct yet interdependent sequential panels fits with the technician's vocabulary and the iconography of chart patterns that have been established through experience to communicate meaning as to the present position and probable future trend of a market. The separation into a sequence of separate panels really clarifies the picture and tells the story of the market.
3. DESCENDING PRICE PEAKS
In this case we cut incisively into the available chart to abstract a sequential order of events that the technical analyst then moulds into a pattern. An icon for symbolizing the motion into the future, in the downward direction defined, might be Marcel Duchamp's famous abstract painting "Nude Descending a Staircase." Indeed, the entire cubist movement in visual art might be a rich area for a technical analyst to study. The parallels between the cubist art form and technical analysis are strong and suggest that borrowing from art depicting motion might pay the technical analyst a large dividend. Among other things, the analyst can become sensitized to how connecting the dots of the descending price peaks reveals a picture plane, gives closure to the unifying properties, and makes the viewer more aware of the design, the trend, as a whole rather than simply the individual components. This, in effect, is the beauty of trendlines. In support of simple trendlines, it has been observed that Duchamp, "more concerned with the idea of motion than the sensation, would eventually reduce such concepts as motion to a single line."5
The moment-to-moment and action-to-action progression of the abstraction of triple descending peaks does not require much involvement by the viewer to interpret its meaning. It is clear-cut, decisive and powerful. Furthermore, the actions and intentions of the buyers and sellers which underlie the descending peaks lend themselves to a common-sense interpretation for grasping the implication of descending price peaks. Implicit in the foregoing conclusion is the realization that effective chart interpretation involves the analyst, who identifies and frames sequences, and then the observer, who reads and internalizes the sequences of panels and labels to inform himself/herself of the motion revealed and the action required.
4. CATASTROPHE PANICS CAUSING PRICE GAPS
The discontinuity of price transaction behavior creates visual gaps or gutters that separate panels of price action. It is the acute imbalance between supply and demand which creates those gaps. Sequential Art identifies these gaps or separations, where nothing has been recorded, as gutters. What attracts the analyst's attention is the comparison of panels of price action before and after a gap. Why? Because the comparison has forecasting implications. The gaps or gutters fracture both time and space, offering a jagged, staccato rhythm of unconnected moments. But the observer's ability to construct continuity across panels generates the ability to mentally create closure. Like Sequential Art, this illustration of gap analysis reinforces that technical market analysis of chart behavior is very much an interplay between the observer and the observed.
Footnotes
1 Scott McCloud, Understanding Comics: The Invisible Art, Kitchen Sink Press (a division of HarperCollins), 1993.
2 Those interested may read a comment, in French, in Paranque (2004).
3 "The wage form keeps employees unaware of the work accomplished" (page 255, Salais et alii), insofar as this accomplished work is expressed in the value produced once the intermediary consumptions have been paid, namely the value added (see page 227 and the works of Paul Boccara on the subject, 1985).
Appendix
Figure 1. A Cusp Catastrophe Model of a Stock Exchange
1998 by Wiley. In 1999, he edited Complexity and Management: A collection of essays, published by World Scientific Publishing. Recently he
co-authored Virtual Corporate Universities, published 2003 by Kluwer
Academic.
DR. BERNARD PARANQUE
Bernard Paranque is a doctor of economics (University of Lyon Lumière, 1984) and holds the Habilitation à diriger les recherches (1995). He
began his career as an associate economist in an accountancy firm in 1984.
In 1990, he joined the Banque de France (French Central Bank) business department. From 1990 to 2000 he produced papers on the financial
structure of non-financial companies (www.ssrn.com). He was a representative of the Banque de France in the European Committee of Central Balance Sheet Offices between 1993 and 2002.
In 1999, he was on secondment from the Banque de France to the Secretary of State to SMEs where he was in charge of the business financing
department. He was also a member of the French delegation to the SMEs
working party of the Business and Environment Committee of the OECD.
His research refers to the "économie des conventions" and is focused on the financial behavior of non-financial organizations and the promotion of specific tools and assessment procedures designed to enhance SMEs'
access to financing.
He is co-author, with Bernard Belletante and Nadine Levratto, of Diversité économique et mode de financement des PME, published in 2001. He is also the co-author, with Hans Friderichs, of "Structures of Corporate Finance in Germany and France," in Jahrbücher für Nationalökonomie und Statistik, 2001.
He is associate researcher of the CNRS team IDHE-ENS Cachan in
Paris and member of the New York Academy of Science.
He has joined Euromed Marseille Ecole de Management as Professor of Finance and Head of the Information and Finance department.
DR. HENRY O. PRUDEN
Hank Pruden is a visiting scholar at Euromed Marseille Ecole de Management, Marseille, France, during 2004-2005. Professor Pruden teaches in the School of Business at Golden Gate University in San Francisco, California, where he has been a professor for 20 years. Hank is more than a theoretician; he has actively traded his own account for the past 20 years. His personal involvement in the market ensures that what he teaches is practical for the trader, and not just abstract academic theory.
He is the Executive Director of the Institute of Technical Market Analysis (ITMA). At Golden Gate he developed the accredited courses in technical market analysis in 1976. Since then the curriculum has expanded to
include advanced topics in technical analysis and trading. In his courses Hank emphasizes the psychology of trading as well as the use of technical analysis methods. He has published extensively in both areas.
Hank has mentored individual and institutional traders in the field of
technical analysis for many years. He is presently on the Board of Directors
of the Technical Securities Analysts Association of San Francisco and is
past president of that association. Hank was also on the Board of Directors
of the Market Technicians Association (MTA). Hank has served as vice chair for the Americas of IFTA (International Federation of Technical Analysts), which educates and certifies analysts worldwide. For eleven years Hank was the editor of The Market Technicians Association Journal, the premier publication of technical analysts. From 1982 to 1993 he was a member of the Board
of Trustees of Golden Gate University.
With 591 data points, the probability of the correlation between cash levels
and interest rates being due to chance alone is essentially zero.
Using regression analysis, we can see the relationship between interest rates and cash reserves. This will allow us to determine what an appropriate level of cash reserves may be given a certain interest rate, which we can then compare to actual current cash reserves. If current reserves are too low given prevailing rates, then fund managers may be overly optimistic; if they are too high,
then they may be excessively pessimistic.
Using data from 1954 through 2003, the regression formula for the relationship between interest rates and mutual fund cash reserves is:
y = 0.4978x + 4.5464
where
y = expected cash reserve
x = current rate on 90-day T-Bills
We can round off these figures and still retain the usefulness of the formula.
Put into different terms, the regression formula tells us that cash reserves during any given month should be approximately 4.5% plus 50% of the current
yield on 90-day T-Bills. Theoretically, if 90-day T-Bills were yielding 0%, then
mutual funds would be expected to carry 4.5% of their assets in liquid investments. This is a baseline amount of cash, presumably needed to cover expenses, redemptions and the like.
We know that in June 2004, cash reserves were at 4.3% of total assets. On
June 30th, the yield on 90-Day T-bills was 1.31%. By plugging that value into
the regression formula, we estimate that mutual funds should have carried 5.20%
of their assets in cash. By taking the difference between what was expected and
what was fact, we can conclude that mutual funds were carrying a cash deficit of 0.90%:
Actual:            4.30%
Expected:          5.20%
Surplus/(Deficit): (0.90%)
By going back and comparing actual levels of cash to those that were expected given the prevailing level of interest rates, we can get a better handle on
the sentiment of portfolio managers without the distorting effects of interest
rates on cash reserves. The difference between actual and expected reserves
will show that fund managers are giving a premium or discount to cash,
and should create an effective contrary sentiment indicator. For purposes of
brevity, we will call the difference between actual and expected cash reserves
RAPAD (Rate Adjusted Premium And Discount).
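The arithmetic above can be packaged into two small helper functions. A minimal sketch in Python, assuming nothing beyond the regression coefficients quoted in the text (the function names are ours):

```python
def expected_cash(tbill_yield):
    """Expected mutual fund cash reserve (%), from the 1954-2003
    regression in the text: y = 0.4978x + 4.5464."""
    return 0.4978 * tbill_yield + 4.5464

def rapad(actual_cash, tbill_yield):
    """Rate Adjusted Premium And Discount: actual minus expected reserves.
    Negative = cash deficit (managerial optimism); positive = cash premium."""
    return actual_cash - expected_cash(tbill_yield)

# June 2004: actual reserves 4.3%, 90-day T-Bill yield 1.31%
print(round(expected_cash(1.31), 2))  # 5.2
print(round(rapad(4.30, 1.31), 2))    # -0.9
```

With the June 2004 inputs this reproduces the 5.20% expected reserve and the 0.90% deficit shown in the table above, and the 0% yield case returns the roughly 4.5% baseline reserve.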
Figure 3 shows how the S&P 500 performed for up to 2 years after mutual
funds were holding cash reserves that were at least 2.25% less than they should
have been given the level of short-term interest rates at the time. The primary
reason for giving cash such a discount was likely that the fund managers felt
very optimistic about the future gains they were likely to make in the stock
market, so they felt the need to be as fully invested as possible. As we can see
from the table, this optimism was generally unwarranted. If we look at the
results after 12 months, the S&P 500 showed an average return of -6.1%.
Looking at the months where cash levels were in a normal range (meaning RAPAD readings within 1.5 standard deviations of the mean), the average
12 month return in the S&P 500 was 8.7% during the study period. One-year
returns after extreme cash discounts therefore underperformed an average return by 14.8%. We also see from Figure 3 that the S&P 500 was higher 12
months later only 22% of the time. There were 36 months that were considered
to show an extreme cash discount, and only 8 times out of those 36 instances
was the S&P 500 higher one year later.
Figure 4 gives us the performance after periods of extreme cash premiums,
meaning those times when fund managers held at least 2.25% more cash than
expected. The results here are markedly different from Figure 3. After 12
months, the S&P 500 was an average of 14.1% higher, outperforming an average month by 5.4%. Out of the 53 months that qualified as exhibiting an extreme cash premium, 47 led to a higher market one year later, for a success
rate of 89%. See Appendix A for a detailed list of all extreme RAPAD readings during the study period.
Figure 5 below shows the correlation between RAPAD readings and S&P
500 returns 12 months later.
FIGURE 3
S&P 500 Performance After RAPAD Reading of -2.25% or Below (Extreme Cash Discount)

                    6 Months   12 Months   18 Months   24 Months
                    Later      Later       Later       Later
Average Return      -3.0%      -6.1%       -5.5%       -1.8%
Percent Positive     31%        22%         36%         50%

FIGURE 4
S&P 500 Performance After RAPAD Reading of +2.25% or Above (Extreme Cash Premium)

                    6 Months   12 Months   18 Months   24 Months
                    Later      Later       Later       Later
Average Return       8.3%      14.1%       19.4%       23.0%
Percent Positive     81%        89%         98%        100%

FIGURE 5
S&P 500 12-Month Returns and RAPAD Readings (scatter plot)
The correlation between RAPAD readings and returns in the S&P 500 one
year later is 0.32, suggesting that if we knew nothing else but what the current
RAPAD reading was, we could improve our prediction of where the S&P 500
would close one year later by about 11%.
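As a back-of-envelope check on this claim, the share of variance explained is the square of the correlation, and the usual t-statistic shows why, with several hundred monthly observations, such a correlation is far from chance. A sketch using only figures quoted in the text (reusing the n = 591 observation count cited earlier for the cash-versus-rates series is our assumption):

```python
import math

r = 0.32   # correlation of RAPAD readings with S&P 500 returns 12 months later
n = 591    # monthly observations, assumed from the count cited earlier

r_squared = r ** 2                               # variance explained, ~10%
t_stat = r * math.sqrt((n - 2) / (1 - r ** 2))   # t-test of r against zero

print(round(r_squared, 3))  # 0.102
print(round(t_stat, 1))     # 8.2
```

The roughly 10% of variance explained is in line with the "about 11%" improvement cited, and a t-statistic above 8 corresponds to a vanishingly small probability that the relationship is due to chance.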
FIGURE 6
CASH LEVELS VS. RATE-ADJUSTED CASH LEVELS
Figure 6 shows us a plot of the S&P 500 (top scale), the raw values of
mutual fund cash reserves (middle scale) and the RAPAD measure of cash premiums and discounts (lower scale).
On the chart, Point A corresponds to July 1976. At the time, 90-Day T-Bills were yielding about 5.2%. According to the regression formula, mutual funds should have been holding about 7.1% of their assets in cash. However, they were holding only 4.7%, so they had about 2.4% less in cash reserves than they should have, given the level of short-term rates at the time. This was a show of extreme optimism on the part of fund managers, and the S&P refused to accommodate them, declining into the beginning of 1978.
By early 1980, managers had built up their cash reserves once more, just in time for a stiff market rally over the next year. In January 1981 (Point B on the chart), 90-Day T-Bill rates had climbed all the way up to 14.6%, giving fund managers a very enticing incentive to hold large amounts of cash. They did hold significantly more cash then than they did in 1976: at Point A, cash levels were around 4.7%, as stated above, while at Point B they stood at 8.3%. Taken on its own, that comparison could easily suggest that fund managers were nowhere near as optimistic at Point B as they were at Point A. However, when we factor in prevailing interest rates, fund managers theoretically should have been holding 11.8% of their assets in cash at the time. Since they had only 8.3% in cash reserves, they were once again deficient by an extreme amount (3.5%). This told us that fund managers were indeed too optimistic, so contrarian investors should have expected a market decline (or at least difficulty making much headway), and the S&P ultimately declined sharply over the next year and a half.
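The expected-cash figures at Points A and B are consistent with a simple linear fit of cash levels to the T-Bill rate. The coefficients below are back-solved from those two worked examples (a 5.2% rate implying 7.1% cash; a 14.6% rate implying 11.8%), not taken from the article's actual regression, so treat them as purely illustrative:

```python
# Sketch of the RAPAD idea: actual cash minus rate-implied "expected" cash.
# Coefficients are back-solved from the article's two worked examples,
# NOT the paper's fitted regression -- illustrative values only.
def expected_cash(tbill_rate: float) -> float:
    return 4.5 + 0.5 * tbill_rate  # hypothetical linear fit

def rapad(actual_cash: float, tbill_rate: float) -> float:
    return actual_cash - expected_cash(tbill_rate)

print(round(rapad(4.7, 5.2), 1))   # -2.4 (Point A, July 1976)
print(round(rapad(8.3, 14.6), 1))  # -3.5 (Point B, January 1981)
```

Both outputs reproduce the cash deficits quoted in the text for the two points.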
Other Factors
In the beginning paragraph, we highlighted several other factors, besides competing assets, which may affect mutual fund cash reserves. We have looked at what relation some of those bear to cash reserves, and there does seem to be a correlation.
a correlation. However, since many of these developments are so new, we do
not have enough data to draw reliable conclusions. Still, it is instructive to
discuss the impacts of these variables on cash reserves so that we can more
readily observe their impact going forward.
The listed options market has grown steadily over the past 10 years. In
1993, the Chicago Board Options Exchange was clearing approximately
9,000,000 options contracts on a monthly basis. By the end of 2003, that volume had tripled. The correlation between monthly options volume on the CBOE
and mutual fund cash levels from 1993 - 2003 was -0.66. This tells us that there was a large negative correlation between option volume and cash levels: as option volume increased, cash levels decreased. This could be a significant factor; however, we are limited by a lack of reliable option volume data. Also, interest rates during this time were steadily decreasing. As we saw above, interest rates have had a definite impact on cash levels over nearly 50 years of data, so it is difficult to determine whether cash levels were impacted more by option activity or by interest rates.
We also checked the correlation between cash reserves and futures market
activity. For the latter, we used commercial trader positions (both long and short)
in the large S&P 500 futures contract from 1986 - 2003. According to the Commodity Futures Trading Commission (CFTC), a commercial trader is a large trader (the definition of "large" has changed over the years) engaged in the futures market for the specific purpose of hedging the trader's daily business activity. Comparing month-end positions in the Commitments of Traders report,
there was a correlation of -0.80 between the futures positions and mutual fund
cash reserves. This is a very tight correlation and tells us that as futures positions increased, cash reserves decreased and vice-versa. Once again, however,
we are limited by the fact that interest rates declined for most of this period.
It would be possible to use a multiple regression formula to determine where mutual fund cash reserves should be, instead of using only interest rates as described above. However, until more time goes by in which we see varying levels of option and futures activity, the utility of that exercise is probably limited.
Another likely factor in cash balances is the impact of rising or falling market prices themselves, regardless of fund manager sentiment. When market prices
rise, we should see a decline in the percentage of assets held in cash, simply
because the total portfolio is worth more than it was before.
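The mechanical effect is easy to illustrate with made-up numbers: a fund holding 5% cash sees that percentage fall when its equity holdings appreciate, even if the manager buys and sells nothing.

```python
# Mechanical dilution of the cash percentage when equities rise
# (hypothetical numbers; no trading by the manager).
cash = 5.0        # $5 of cash
equities = 95.0   # $95 of stock -> cash is 5% of total assets
equities *= 1.20  # market rallies 20%; equities now worth $114
cash_pct = cash / (cash + equities) * 100
print(round(cash_pct, 1))  # 4.2
```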
To check whether this may be the case, we looked at the month-to-month percentage change in the S&P 500 and compared it to the month-to-month change in mutual fund cash balances. The correlation was -0.38, which means that one factor could statistically account for around 14% of the movement in the other. Since the correlation is negative, it helps to confirm the theory that cash balances fall when prices rise and vice versa. The same negative correlation holds (although it falls to -0.20) when we look at this month's change in the S&P 500 and next month's change in cash balances.
This is another significant factor and should be included in any discussion about
whether cash levels have moved an inordinate amount in any given period.
However, it is important to distinguish those times when funds are holding low levels of cash because they are overly optimistic from those times when they are holding low cash reserves because there are few other alternatives. With 90-Day T-Bill rates yielding barely above 1% at the time, June 2004 was certainly one of the latter.
This does not mean the stock market cannot, or should not, decline; it simply means that overzealous fund managers are not necessarily a catalyst. If
we see rates rise significantly in 2004, but cash reserves at mutual funds hold
steady or decline, then there may be some real evidence that fund managers are
excessively optimistic.
As we saw from Figure 3, overly optimistic portfolio managers are a good
sign that whatever rally is in place may be about to lose steam.
Recent Activity
As of June 2004, mutual funds held 4.3% of their total assets in liquid assets. Given the low level of short-term interest rates at the time, it is not entirely
unexpected that cash reserves would be so low. Still, anytime the absolute level
of cash is low, we believe investors should be worried. While fund companies
have better reporting systems now than they did 20 years ago, there is still the
possibility of a cash crunch, whereby unexpected redemptions cause heavy
selling by mutual funds to meet the redemptions since they do not have adequate cash on hand to cover them.
This of course would exacerbate the market decline that is likely the reason
for the redemptions in the first place.
Sources
Investment Company Institute (http://www.icinet.net/)
The Federal Reserve Bank of St. Louis (http://www.stlouisfed.org/)
Appendix A
The table below outlines each month where the RAPAD reading was considered extreme. The table gives the month of the occurrence, the S&P 500
cash index level at the time, the RAPAD reading for that month, and the return
in the S&P 500 cash index 6, 12, 18 and 24 months later.
Date       S&P 500    RAPAD    6 Mo.    12 Mo.   18 Mo.   24 Mo.
                               Later    Later    Later    Later

All Occurrences with RAPAD Readings of -2.25 and Below (Extreme Cash Deficit)
1/30/81     129.55    -3.55     1.1%    -7.1%   -17.3%    12.2%
3/31/00    1498.58    -3.39    -4.1%   -22.6%   -30.5%   -23.4%
4/30/81     132.81    -3.22    -8.2%   -12.3%     0.7%    23.8%
7/31/81     130.92    -3.06    -8.0%   -18.2%    11.0%    24.2%
5/29/81     132.59    -2.99    -4.7%   -15.6%     4.5%    22.5%
2/27/81     131.27    -2.98    -6.5%   -13.8%    -9.0%    12.8%
2/29/00    1366.42    -2.94    11.1%    -9.3%   -17.0%   -19.0%
8/31/81     122.79    -2.94    -7.9%    -2.7%    20.6%    33.9%
1/31/00    1394.46    -2.90     2.6%    -2.0%   -13.1%   -19.0%
8/31/00    1517.68    -2.90   -18.3%   -25.3%   -27.1%   -39.6%
1/31/73     116.03    -2.84    -6.7%   -16.8%   -31.6%   -33.7%
6/30/81     131.21    -2.82    -6.6%   -16.5%     7.2%    27.8%
12/31/99   1469.25    -2.82    -1.0%   -10.1%   -16.7%   -21.9%
9/30/71      98.34    -2.81     9.0%    12.4%    13.4%    10.3%
4/30/98    1111.75    -2.77    -1.2%    20.1%    22.6%    30.6%
12/29/72    118.05    -2.77   -11.7%   -17.4%   -27.1%   -41.9%
11/28/80    140.52    -2.69    -5.6%   -10.1%   -20.4%    -1.4%
6/30/71      98.70    -2.62     3.4%     8.6%    19.6%     5.6%
5/29/98    1090.82    -2.58     6.7%    19.3%    27.3%    30.2%
5/31/71      99.63    -2.56    -5.7%     9.9%    17.1%     5.3%
12/31/80    135.76    -2.55    -3.4%    -9.7%   -19.3%     3.6%
7/31/00    1430.83    -2.55    -4.5%   -15.3%   -21.0%   -36.3%
3/31/98    1101.75    -2.55    -7.7%    16.8%    16.4%    36.0%
11/30/99   1388.91    -2.51     2.3%    -5.3%    -9.6%   -18.0%
6/30/00    1454.60    -2.49    -9.2%   -15.8%   -21.1%   -32.0%
5/31/72     109.53    -2.48     6.5%    -4.2%   -12.4%   -20.3%
4/28/00    1452.43    -2.46    -1.6%   -14.0%   -27.0%   -25.9%
7/30/71      95.58    -2.44     8.7%    12.4%    21.4%    13.2%
6/30/98    1133.84    -2.42     8.4%    21.1%    29.6%    28.3%
9/30/76     105.24    -2.42    -6.5%    -8.3%   -15.2%    -2.6%
7/30/76     103.44    -2.38    -1.4%    -4.4%   -13.7%    -2.7%
10/29/76    102.90    -2.33    -4.3%   -10.3%    -5.9%    -9.5%
6/30/76     104.28    -2.33     3.0%    -3.6%    -8.8%    -8.4%
3/31/81     136.00    -2.28   -14.6%   -17.7%   -11.5%    12.5%
8/31/76     102.91    -2.26    -3.0%    -6.0%   -15.4%     0.4%
9/29/00    1436.51    -2.26   -19.2%   -27.5%   -20.1%   -43.2%

Average Return                 -3.0%    -6.1%    -5.5%    -1.8%
Number of Occurrences             36       36       36       36
Number of Positive Occurrences    11        8       13       18
Positive Occurrences as % of Total 31%     22%      36%      50%
Maximum Return                 11.1%    21.1%    29.6%    36.0%
Minimum Return                -19.2%   -27.5%   -31.6%   -43.2%
All Occurrences with RAPAD Readings of +2.25 and Above (Extreme Cash Surplus)
10/29/93    467.83     2.25    -3.6%     1.0%    10.0%    24.3%
11/29/74     69.97     2.26    30.3%    30.4%    43.2%    45.9%
11/28/86    249.22     2.27    16.4%    -7.6%     5.2%     9.8%
1/31/94     481.61     2.28    -4.8%    -2.3%    16.7%    32.1%
5/30/80     111.24     2.29    26.3%    19.2%    13.6%     0.6%
6/30/88     273.50     2.29     1.5%    16.3%    29.2%    30.9%
6/30/58      45.24     2.31    22.0%    29.2%    32.4%    25.8%
4/30/86     235.52     2.32     3.6%    22.4%     6.9%    11.0%
6/30/93     450.53     2.35     3.5%    -1.4%     1.9%    20.9%
7/31/86     236.12     2.37    16.1%    35.0%     8.9%    15.2%
6/29/90     358.02     2.40    -7.8%     3.7%    16.5%    14.0%
7/31/70      78.05     2.41    22.8%    22.5%    33.2%    37.6%
8/31/88     261.52     2.42    10.5%    34.4%    26.9%    23.3%
7/31/90     356.15     2.43    -3.4%     8.9%    14.8%    19.1%
5/31/93     450.19     2.44     2.6%     1.4%     0.8%    18.5%
6/30/92     408.14     2.48     6.8%    10.4%    14.3%     8.9%
8/29/86     252.93     2.48    12.4%    30.4%     5.9%     3.4%
10/30/92    418.68     2.48     5.1%    11.7%     7.7%    12.8%
11/30/92    431.35     2.53     4.4%     7.1%     5.8%     5.2%
9/30/88     271.91     2.54     8.4%    28.4%    25.0%    12.6%
10/31/66     80.20     2.55    17.2%    16.3%    21.5%    28.9%
7/30/93     448.13     2.55     7.5%     2.3%     5.0%    25.4%
2/29/88     267.82     2.56    -2.4%     7.9%    31.2%    23.9%
10/31/86    243.98     2.57    18.2%     3.2%     7.1%    14.3%
10/31/85    189.82     2.57    24.1%    28.5%    51.9%    32.6%
2/28/94     467.14     2.58     1.8%     4.3%    20.3%    37.1%
7/29/88     272.02     2.59     9.4%    27.2%    21.0%    30.9%
9/30/92     417.80     2.61     8.1%     9.8%     6.7%    10.7%
7/31/92     424.21     2.67     3.4%     5.6%    13.5%     8.0%
5/31/88     262.16     2.75     4.4%    22.3%    32.0%    37.8%
11/30/89    345.99     2.78     4.4%    -6.9%    12.7%     8.4%
2/26/93     443.38     2.79     4.6%     5.4%     7.2%     9.9%
1/29/88     257.07     2.85     5.8%    15.7%    34.6%    28.0%
5/31/90     361.23     2.90   -10.8%     7.9%     3.9%    15.0%
3/31/88     258.89     2.91     5.0%    13.9%    34.9%    31.3%
9/30/86     231.32     2.97    26.1%    39.1%    11.9%    17.5%
8/31/92     414.03     2.98     7.1%    12.0%    12.8%    14.8%
4/30/87     288.36     3.00   -12.7%    -9.4%    -3.3%     7.4%
1/31/90     329.08     3.10     8.2%     4.5%    17.8%    24.2%
3/31/93     451.67     3.11     1.6%    -1.3%     2.4%    10.9%
2/28/90     331.89     3.19    -2.8%    10.6%    19.1%    24.3%
10/30/87    251.79     3.23     3.8%    10.8%    23.0%    35.2%
4/29/88     261.33     3.28     6.8%    18.5%    30.2%    26.6%
3/30/90     339.94     3.47   -10.0%    10.4%    14.1%    18.8%
4/30/93     440.19     3.51     6.3%     2.4%     7.3%    16.9%
8/31/90     322.56     3.67    13.8%    22.6%    27.9%    28.4%
12/31/90    330.22     3.75    12.4%    26.3%    23.6%    31.9%
9/30/74      63.54     3.99    31.2%    32.0%    61.7%    65.6%
11/30/87    230.30     4.06    13.8%    18.8%    39.2%    50.2%
4/30/90     330.80     4.08    -8.1%    13.5%    18.6%    25.4%
11/30/90    322.22     4.36    21.0%    16.4%    28.9%    33.9%
9/28/90     306.05     4.60    22.6%    26.7%    31.9%    36.5%
10/31/90    304.00     4.81    23.5%    29.1%    36.5%    37.7%

Average Return                  8.3%    14.1%    19.4%    23.0%
Number of Occurrences             53       53       53       53
Number of Positive Occurrences    43       47       52       53
Positive Occurrences as % of Total 81%     89%      98%     100%
Maximum Return                 31.2%    39.1%    61.7%    65.6%
Minimum Return                -12.7%    -9.4%    -3.3%     0.6%
The figures below give the average of all data during the study period.

Average Return                  4.1%     8.1%    12.2%    16.6%
Number of Occurrences            585      579      573      567
Number of Positive Occurrences   386      403      419      448
Positive Occurrences as % of Total 66%     70%      73%      79%
Appendix B
Trade signals using the extremes in the RAPAD mutual fund cash level
as entries and exits.
Biography
Jason Goepfert is the President and CEO of Sundial Capital Research, Inc., a firm focused on the research and practical application of mass psychology to the financial markets. Prior to founding Sundial, Jason managed the operations of a large discount brokerage firm and a multi-billion-dollar hedge fund, experience which firmly planted the idea that logic rarely trumped emotion when it came to traders' investment decisions. Sundial trades proprietary capital and releases its research to institutional clients and individual investors via its web site, www.sentimenTrader.com.
the short-term average volume for 10 days is 1.5 million shares a day, and the
long-term volume average for 50 days is 750,000 shares per day. The VM is 2
(1,500,000/750,000). This calculation is then multiplied by the VPC+/- after it
has been multiplied by the VPR.
Now we have all the information necessary to calculate the VPCI. The
VPC+ confirmation of +1.5 is multiplied by the VPR of 1.25, giving 1.875.
Then 1.875 is multiplied by the VM of 2, giving a VPCI of 3.75. Although this
number is indicative of an issue under very strong volume-price confirmation,
this information serves best relative to the current and prior price trend and
relative to recent VPCI levels. Discussed next is how best to use the VPCI.
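The arithmetic of the worked example can be sketched directly. The component values (VPC+, VPR, VM) are taken from the example above; how each component is derived is defined earlier in the article, so only the final multiplication is reproduced here:

```python
# VPCI = VPC(+/-) * VPR * VM, using the worked example's component values.
# The component formulas themselves are defined earlier in the article.
def vpci(vpc: float, vpr: float, vm: float) -> float:
    return vpc * vpr * vm

vm = 1_500_000 / 750_000    # short-term vs. long-term average daily volume
print(vpci(1.5, 1.25, vm))  # 3.75
```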
Using VPCI
Confirming Signals

Price Trend    VPCI         Relationship     Implications
Rising         Rising       Confirmation     Bullish
Rising         Declining    Contradiction    Bearish
Declining      Rising       Confirmation     Bearish
Declining      Declining    Contradiction    Bullish
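The signal table is effectively a two-input lookup. A minimal sketch following its pairings (the function and return shape are mine, not the article's):

```python
# Classify a (price trend, VPCI trend) pair per the signal table.
# Per the table: a rising VPCI marks confirmation, a declining VPCI marks
# contradiction, and the implication is bullish when both move together.
def classify(price_rising: bool, vpci_rising: bool) -> tuple:
    relationship = "Confirmation" if vpci_rising else "Contradiction"
    implication = "Bullish" if price_rising == vpci_rising else "Bearish"
    return relationship, implication

print(classify(True, True))    # ('Confirmation', 'Bullish')
print(classify(False, False))  # ('Contradiction', 'Bullish')
```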
VPCI in Action
In our first example (Figure 1), the price trend of SIRI is rising and the VPCI is also rising. Here the VPCI is giving three bullish signals, the most important being that the VPCI is rising. Increasing volume-price confirmation demonstrates strengthening commitment to the existing price trend of demand. Secondly, the VPCI smoothed is rising and the VPCI has crossed above it, indicating momentum within the confirmation. This is a good indication that the existing bullish price trend will continue. Last and least important, both the VPCI and the VPCI smoothed are above the zero line, indicating healthy longer-term accumulation. All of these VPCI indications are interpreted as bullish only because SIRI's prevailing trend is rising.
A falling stock price and a rising VPCI (Figure 3) is an example of volume-price confirmation. In our illustration, GSK's stock price is falling and the VPCI is rising, indicating control is clearly in the hands of sellers. The VPCI moves gradually upward, supporting the downward price movement. Gaining momentum, the VPCI crosses above zero and eventually through the VPCI smoothed. GSK's stock price breaks down shortly afterwards on the selling pressure.
Putting it all together, let us take a look at one final example of the VPCI in action (Figure 5). It's extremely important to note when using the VPCI that volume leads or precedes price action. Unlike most indicators, the VPCI will often give indications before price trends are clear. Thus, when a VPCI signal is given in an unclear price trend, it is best to wait until one is evident. This final example is given in a weekly timeframe to illustrate VPCI signals in a longer-term cycle.
At Point 1 in Figure 5, CMX is breaking out and the VPCI confirms this breakout as it rapidly rises, crossing over the VPCI smoothed and zero. This is an example of a VPCI bullish confirmation. Later, the VPCI begins to fall during the uptrend, suggesting a pause within the new uptrend. This small movement is a bearish contradiction. At Point 2, CMX's price falls as the VPCI continues to fall below zero and eventually through the VPCI smoothed, gaining momentum. This is a classic example of a countertrend VPCI bullish contradiction. At Point 3, the VPCI has bottomed out and, with CMX, begins to rise, confirming the last VPCI signal. Later, at Point 3, the VPCI moves upward, supporting the higher price movement. By Point 4, CMX breaks through resistance while the VPCI's upward momentum accelerates rapidly, crossing the VPCI smoothed and zero. From this bullish confirmation, one could deduce a high probability of a price breakout, illustrating bullish confirmation once again.

Figure 5. VPCI in action: CMX
Applying the VPCI information to a trading system should improve profitability. To evaluate this VPCI hypothesis, it was tested via a trading system contrasting two moving average systems. The goal of this study was not to achieve optimum profitability but to compare a system using VPCI signals to one not using them. The crossing of the five-day and 20-day moving averages was used to generate buy and sell signals. The five-day moving average represents the cost basis of traders in a one-week timeframe. The 20-day moving average represents the cost basis of traders in a one-month timeframe. The shorter moving average is more responsive to current price action and trend changes, because it emphasizes more recent price changes. The longer-term moving average comprises more information and is more indicative of the longer-term trend. Because its scope is broader, the longer-term moving average normally lags behind the action of the shorter moving average. When a moving average curls upward, the investors within this timeframe are experiencing positive momentum. The opposite is true when the moving average curls downward. When the short-term moving average's momentum is significant enough to cross over the longer-term moving average, this is an indication of a rising trend, otherwise known as a buy signal. Likewise, when the shorter-term moving average's momentum crosses under the longer-term moving average, a sell signal is generated.
Back-tested first was a five- and 20-day crossover system. A long position
is taken when the short-term moving average crosses above the long-term moving average. A short position is taken when the short-term moving average
crosses under the long-term moving average. These actions tend to represent
short-term changes in momentum and trend. In the comparative study, I used the same five- and 20-day crossover, but kept only the trades where the VPCI had crossed over a smoothed VPCI, indicating a rising VPCI, or price confirmation. The VPCI settings are the same as the moving averages': 20 days for the long-term component and five days for the short-term component. The VPCI smoothed is the 10-day average of the VPCI.
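The comparison can be sketched in plain Python. The 5/20-day crossover and the "keep only trades where the VPCI is above its 10-day smoothing" filter follow the description in the text; the function names and data handling are my own, and the article's lookback delays are omitted for brevity:

```python
# Sketch of the 5/20-day MA crossover with an optional VPCI filter.
def sma(values, n):
    """Simple moving average; None until n points are available."""
    return [None if i + 1 < n else sum(values[i + 1 - n:i + 1]) / n
            for i in range(len(values))]

def crossover_signals(close, vpci=None, vpci_smooth_n=10):
    """Return +1 (buy) / -1 (sell) on 5/20-day MA crosses.

    If a VPCI series is supplied, keep only signals where the VPCI is
    above its vpci_smooth_n-day average (price-volume confirmation)."""
    fast, slow = sma(close, 5), sma(close, 20)
    vs = sma(vpci, vpci_smooth_n) if vpci is not None else None
    signals = [0] * len(close)
    for i in range(1, len(close)):
        if None in (fast[i], slow[i], fast[i - 1], slow[i - 1]):
            continue
        crossed_up = fast[i - 1] <= slow[i - 1] and fast[i] > slow[i]
        crossed_dn = fast[i - 1] >= slow[i - 1] and fast[i] < slow[i]
        if not (crossed_up or crossed_dn):
            continue
        if vs is not None and not (vs[i] is not None and vpci[i] > vs[i]):
            continue  # no VPCI confirmation: skip the trade
        signals[i] = 1 if crossed_up else -1
    return signals
```

For example, a price series that declines for 20 days and then rallies produces a single buy signal; supplying a flat VPCI series suppresses it, since a flat VPCI never rises above its own smoothing.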
There are a number of limitations to a study framed this way, but these settings were chosen deliberately to keep the study simple and uncompromised. First, the five- and 20-day moving average settings are too short to indicate a strong trend. This detracts from the effectiveness of the VPCI as an indicator of price trend confirmation or contradiction. However, although these settings are short, they provide more trades than a longer-term trend system, creating a more significant sample size. Also, the VPCI settings of five and 20 days are too short when the price data is only 20 days old (the length of the long-term moving average). By using these time settings, the VPCI may give indications ahead of the price trend or momentum signals given by the moving averages. However, changing the settings could be interpreted as optimizing. Accordingly, a 10-day lookback delay on the VPCI and a five-day lookback delay on the VPCI smoothed were installed. This delay gives the VPCI confirmation signal more synchronicity with the lagging moving average crossover. Ideally, VPCI delays should be tuned to the individual issue. My testing has shown that more responsive high-volume and high-volatility issues generally do not require delays as long as those of slower-moving low-volume and low-volatility issues. One could also use trend lines corresponding to the timeframe being applied to tune the VPCI.
To ensure a broad scope within the sample being studied, the test was broken into several elements. Securities were selected across three areas of capitalization: small, as measured by the S&P Small Cap Index; medium, as measured by the S&P 400 Mid Cap Index; and large, as measured by the S&P 100 Large Cap Index. Equally important are the trading characteristics of each security. Thus, securities were further characterized by volume and volatility. Combining these seven traits forms a total of 12 groups: small cap high volume, small cap low volume, small cap high volatility, small cap low volatility, mid cap high volume, mid cap low volume, mid cap high volatility, mid cap low volatility, large cap high volume, large cap low volume, large cap high volatility and large cap low volatility (Table 2).
Table 2. Sixty Securities Organized by Size, Volume and Volatility

Large Cap Low Volatility:    PG, SO, BUD, WFC, PEP
Large Cap High Volatility:   EP, AES, DAL, ATI, NSM
Large Cap High Volume:       CSCO, MSFT, INTC, ORCL, GE
Large Cap Low Volume:        ATI, HET, BDK, GD, CPB
Mid Cap Low Volatility:      MDU, ATG, WPS, HE, NFG
Mid Cap High Volatility:     ESI, LTXX, NDN, WIND, SEPR
Mid Cap High Volume:         ATML, SNDK, COMS, MLMN, CY
Mid Cap Low Volume:          WPO, BDG, TECUA, KELYA, CRS
Small Cap Low Volatility:    UNS, UBSI, CIMA, CTCO, ATO
Small Cap High Volatility:   BRKT, ZIXI, MZ, LENS, CRY
Small Cap High Volume:       CYBX, MOGN, KLIC, HLIT, YELL
Small Cap Low Volume:        NPK, GMP, SXI, SKY, LAWS
Commissions were not included. The testing period used was August 15, 1996,
to June 22, 2004, for a total of 2,000 trading days. The results were measured
in terms of profitability, reliability and risk-adjusted return.
Profitability
Profitability was tested using a five- and 20-day moving average crossover and then retested using only those trades also displaying VPCI confirmation signals. The results were impressive (Figure 6). Broadly, the VPCI improved profitability in the three size classes (small, mid and large caps) and in all four style classifications (high and low volume, and high and low volatility). Nine of the 12 subgroups showed improvement. The exceptions were mid cap high-volatility issues, and small and large cap low-volume issues. Of the 60 issues tested, 39, or 65%, showed improved results using the VPCI. The VPCI group made $381,089, compared to only $169,092 for the competing non-VPCI group. Thus, overall profitability was boosted by $211,997 with the VPCI.
Figure 6. Profitability Improvement with VPCI
Reliability
Risk-Adjusted Returns
For instance, one issue may generate $40,000 in losses and $50,000 in gains, whereas a second issue may generate $10,000 in losses and $20,000 in gains. Both issues generate a $10,000 net profit. However, an investor could expect to make $1.25 for every dollar lost in the first system, while expecting to make $2 for every dollar lost in the second system. The figures of $1.25 and $2 represent the profit factor. Even more significant improvements, across all size, volume and volatility groups, were again achieved using the VPCI. Of the 12 subgroups, only large cap low-volatility issues did not show an improvement with the VPCI. Overall, the profit factor was improved by 19%, meaning one could expect to earn 19% more profit for every dollar lost when applying the VPCI to the trading system.
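The worked numbers reduce to a one-line ratio of gross gains to gross losses:

```python
# Profit factor: gross gains divided by gross losses, using the two
# hypothetical issues from the text.
def profit_factor(gross_gains: float, gross_losses: float) -> float:
    return gross_gains / gross_losses

print(profit_factor(50_000, 40_000))  # 1.25
print(profit_factor(20_000, 10_000))  # 2.0
```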
Figure 8. Profit Factor Improvement Using VPCI
Among the 12 Subgroups
Conclusion
The VPCI reconciles volume and price as determined by each of their proportional weights. This information can be used to confirm or deny the likelihood of a current price trend continuing. This study clearly demonstrates that adding the VPCI indicator to a trend-following system results in consistently improved performance across all major areas measured by the study. Like a maestro's baton in the hands of a proficient investor, the Volume Price Confirmation Indicator is a tool capable of substantially accelerating profits, reducing risk and empowering the investor to make more reliable investment decisions.
Footnotes
Biography
Other Applications
The raw VPCI calculation may be used as a multiplier or divisor in conjunction with other indicators such as moving averages, momentum indicators, or price and volume data. For example, if an investor has a trailing stop loss order set at the five-week moving average of the lows, one could divide the stop price by the VPCI calculation. This would lower the stop price when price and volume are in confirmation, increasing the probability of keeping an issue that is demonstrating price-volume confirmation. However, when price and volume are in contradiction, dividing the stop loss by the VPCI would raise the stop price, preserving capital. Similarly, using the VPCI as an add-on to various other price, volume and momentum indicators may not only improve reliability but increase responsiveness as well.
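A numeric sketch of the stop adjustment (the stop and VPCI values are made up; in practice the raw VPCI can also be near zero or negative, which this toy example ignores):

```python
# VPCI-adjusted trailing stop: divide the raw stop by the VPCI.
# A VPCI above 1 (confirmation) loosens the stop; below 1 (contradiction)
# tightens it. Numbers are illustrative only.
def adjusted_stop(stop_price: float, vpci: float) -> float:
    return stop_price / vpci

print(round(adjusted_stop(50.0, 1.25), 1))  # 40.0 (confirmation: stop lowered)
print(round(adjusted_stop(50.0, 0.80), 1))  # 62.5 (contradiction: stop raised)
```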