
JOURNAL of Technical Analysis
Issue 62, Summer-Fall 2004

Market Technicians Association, Inc.
A Not-For-Profit Professional Organization, Incorporated 1973


Table of Contents
This issue of the Journal of Technical Analysis includes the most recent Charles H. Dow Award winner, Jason Goepfert. His study is an update on work that has been done before by others, but it is extremely detailed and comprehensive, and it adds to our knowledge of how mutual fund cash positions reflect investment management opinion. Our old friend, and prior editor of the Journal, Henry Pruden, with two French professors, Bernard Paranque and Walter Baets, continues his discussion of a possible explanation for the connection between behavioral finance and stock market behavior. As always, they introduce many new ideas to think about. And finally, we have an article by Buff Dormeier in which he devises a new way to look at volume and price action together that appears to have some predictive ability.

Charles D. Kirkpatrick II, CMT, Editor

Journal Editor & Reviewers

The Organization of the Market Technicians Association, Inc.

Behavioral Finance and Technical Analysis
Interpreting Data From an Experiment on Irrational Exuberance, Part B: Reflections from Three Different Angles
Henry O. Pruden, Ph.D.; Dr. Bernard Paranque; Dr. Walter Baets

2004 Charles H. Dow Award Winner
Mutual Fund Cash Reserves, the Risk-Free Rate and Stock Market Performance
Jason Goepfert

Introducing the Volume Price Confirmation Indicator (VPCI): Price & Volume Reconciled
Buff Dormeier, CMT


Journal Editor & Reviewers


Editor
Charles D. Kirkpatrick II, CMT
Kirkpatrick & Company, Inc.
Bayfield, Colorado

Associate Editor
Michael Carr, CMT
Cheyenne, Wyoming

Manuscript Reviewers
Connie Brown, CMT
Aerodynamic Investments Inc.
Pawleys Island, South Carolina

J. Ronald Davis, CMT
Golum Investors, Inc.
Portland, Oregon

Kenneth G. Tower, CMT
CyberTrader, Inc.
Princeton, New Jersey

Julie Dahlquist, Ph.D.
University of Texas
San Antonio, Texas

Cynthia Kase, CMT
Kase and Company
Albuquerque, New Mexico

Avner Wolf, Ph.D.
Bernard M. Baruch College of the City University of New York
New York, New York

Michael J. Moody, CMT
Dorsey, Wright & Associates
Pasadena, California

Production Coordinator
Barbara I. Gomperts
Manager, Marketing Services, MTA
Marblehead, Massachusetts

Publisher
Market Technicians Association, Inc.
74 Main Street, 3rd Floor
Woodbridge, New Jersey 07095

JOURNAL of Technical Analysis is published by the Market Technicians Association, Inc. (MTA), 74 Main Street, 3rd Floor, Woodbridge, NJ 07095. Its purpose is to promote the investigation and analysis of the price and volume activities of the world's financial markets. JOURNAL of Technical Analysis is distributed to individuals (both academic and practitioner) and libraries in the United States, Canada and several other countries in Europe and Asia. JOURNAL of Technical Analysis is copyrighted by the Market Technicians Association and registered with the Library of Congress. All rights are reserved.


The Organization of the Market Technicians Association, Inc.

Member and Affiliate Information

MTA MEMBER
Member category is available to those whose professional efforts are spent practicing financial technical analysis that is either made available to the investing public or becomes a primary input into an active portfolio management process, or for whom technical analysis is a primary basis of their investment decision-making process. Applicants for Membership must be engaged in the above capacity for five years and must be sponsored by three MTA Members familiar with the applicant's work.

MTA AFFILIATE
MTA Affiliate status is available to individuals who are interested in technical analysis and the benefits of the MTA listed below. Most importantly, Affiliates are included in the vast network of MTA Members and Affiliates across the nation and the world, providing you with common ground among fellow technicians.

DUES
Dues for Members and Affiliates are $300 per year and are payable when joining the MTA and annually on July 1st. College students may join at a reduced rate of $50 with the endorsement of a professor. Applicants for Member status will be charged a one-time application fee of $25.

Members and Affiliates
• have access to the Placement Committee (career placement)
• can register for the CMT Program
• may attend regional and national meetings with featured speakers
• receive a reduced rate for the annual seminar
• receive the monthly newsletter, Technically Speaking
• receive the Journal of Technical Analysis, bi-annually
• have access to the MTA website and their own personal page
• have access to the MTA lending library
• become a Colleague of the International Federation of Technical Analysts (IFTA)

Journal Submission Guidelines

We want your article to be published and to be read. In the latter regard, we ask for active simple rather than passive sentences, minimal syllables per word, and brevity. Charts and graphs must be cited in the text, clearly marked, and limited in number. All equations should be explained in simple English, and introductions and summaries should be concise and informative.
1. Authors should submit, with a cover letter, their manuscript and supporting material on a 1.44mb diskette or through email. The cover letter should include the authors' names, addresses, telephone numbers, email addresses, the article title, format of the manuscript and charts, and a brief description of the files submitted. We prefer Word for documents and *.jpg for charts, graphs or illustrations.
2. As well as the manuscript, references, endnotes, tables, charts, figures, or illustrations, each in separate files on the diskette, we request that the authors submit a non-technical abstract of the paper as well as a short biography of each author, including educational background and special designations such as Ph.D., CFA or CMT.
3. References should be limited to works cited in the text and should follow the format standard to the Journal of Finance.
4. Upon acceptance of the article, to conform to the above style conventions, we maintain the right to make revisions or to return the manuscript to the author for revisions.

Please submit your non-CMT paper to:
Charles D. Kirkpatrick II, CMT
7669 CR 502
Bayfield, CO 81122
journal@mta.org


Behavioral Finance and Technical Analysis

Interpreting Data from an Experiment on Irrational Exuberance,
Part B: Reflections from Three Different Angles

Professors Walter Baets, Bernard Paranque and Henry Pruden,
Euromed-Marseille École de Management
In Part A of "Interpreting the Findings of an Experiment on Irrational Exuberance..." the authors organized their analyses around a positive theory of behavioral finance and the nominal theory of technical market analysis rules. The behavioral finance model for structuring the data of the experiment was the Cusp Catastrophe Model of non-linear behavior. The nominal model, based upon the data exposed by the positive model, was a group of four technical market analysis principles and one mental discipline/trading strategy discipline.

This article, the Part B of the series, seeks to extend the interpretations of the findings. But rather than the single, tightly structured theme of Part A, Part B calls upon the varied talents of all three co-authors. This article takes advantage of the international, cross-cultural and multiple disciplines represented by the three authors. Hence, each of the three co-authors was asked to analyze and re-interpret the experimental evidence from his particular professional discipline. Thus, the three co-authors reflect interpretations from three different angles.

The first subsection of this article, by Walter Baets, reflects his area of discipline, which is complexity and knowledge management. Dr. Baets reflected upon the behavior that gave rise to the price behavior generated by the experiment. Dr. Baets provides a broad perspective upon behavioral finance notions, and he gives a penetrating look into the structure behind the actions of the student traders in the Cal Tech Experiment. Dr. Baets considers SWARM-like theories to explain the emergent behavior generated by the individualistic, self-serving behavior of interacting agents, the students in the experiment.

The second sub-section, by Professor Bernard Paranque, reflects his formation as a doctor of economics and his responsibilities as Head of the Finance and Information Department. Dr. Paranque sets forth reflections upon the experiment described in Part A that touch upon the sensitive and vital but often unexamined issue of risk and welfare for all market participants. This viewpoint stands opposite to the self-serving behavior of the few elite traders who could have exploited the cusp and profited from the decline using technical analysis tools. In other words, Dr. Paranque takes up the challenge of examining the ethical dimension of behavioral finance and technical market analysis. His point of departure is the "greater fool" theory operating during the experiment.

Pruden takes a pragmatic yet artistic approach to the extraction of more information from the technical analysis rules that were used to interpret the laboratory data, rules that could have been used by the astute, elite trader to exit the market in advance of the crash in prices.

The third section, by Pruden, reflects upon the four technical rules or principles that were applied in the Part A article. In Part B, Pruden seeks to extend the analytical capacity of each of the four technical rules or principles by carrying them into the realm of Sequential Art. These were the tools that could have been employed by the astute, elite trader to identify the cusp in time to avoid the catastrophic crash. His goal is to offer ideas and techniques for extracting even more information from the data found in the laboratory experiment. Extensions stimulated by notions from Sequential Art1 may offer value added to technical analysts in general.

Sub-Part One by Dr. Walter Baets

Agent Behavior and SWARM-Like Theory

"There is no path; you lay down the path in walking." - Machado, a Spanish poet

In the Part A article, we observed a clustered, rather linear and persistent behaviour of actors/agents. In fact, an interesting observation is that the model visualizes the emergence of a certain kind of local stabilities (probably comparable to what are known as attractors in complexity theory) before something like a (belief?) shift takes place, moving the agents and the system into what could be called discontinuous behavior. The model indeed visualizes the emergence of interacting agents, yet it does not allow us to gain insight into the mechanism of the construction of the phenomenon it describes. If we wish to take this argument further and get a deeper understanding of both the market behaviour and specifically the role of the interacting agents, we should go deeper into theories that are emergent in nature and simulate agent-based behavior. Commonly known are SWARM-like theories.
Behavioural Finance and Technical Analysis point out the coordination problem of the agents' actions. This is commonly accepted, but the coordination problem of the agents' behavior has been studied under a certain (widely accepted) ontological and epistemological assumption of causality, i.e. that reality is based on a causal interaction between variables, independent of the emotional aspects of human agents. The causal approach is based upon the assumption, still held by a large part of cognitive psychologists, that the mind is a processor (a computer) of information that is caught outside the person. It denies that the observer creates his own reality while observing, and it denies the fact that market behaviour also includes the significant interaction of agents (and their respective behaviours). A consequence of this ontology is that only what can be measured can be managed and, more broadly, that only what can be observed exists. It is this rational, reductionist view of human behaviour that we often find in technical analysis. Within this ontological and epistemological choice, causality makes sense, and (knowledge) engineering approaches should be able to give answers to issues of market behaviour. Knowledge engineering techniques have been extensively used in order to construct market analysis tools.
Keep in mind that we classically talk about emotions and psychology, but always and only within the above-described ontology. That seriously limits our view and hence what we will eventually observe. In order to observe differently, we have to investigate first the ontology behind our thinking. An alternative ontology that increasingly gains attention is the one based on what neurobiologists (Maturana and Varela) call an enacted and embodied view of cognition. This ontology is based on the acceptance that the observer himself creates the reality which he observes. There is no fixed reality; it is created as you observe. This concept would indeed allow us to explain what we might call the non-rational behaviour of traders, for instance. Traders are not irrational, but they can only observe the reality which their experience (and their learning) allows them to observe. This ontology gives power to the individual agents interacting in a network that all together co-create reality (in a dynamic process). In fact this ontology is an emergent one, in which knowledge and behaviour are continuously created via interaction and hence cannot be anticipated using top-down causal models. Indeed, this ontology is not based on causal relationships, but rather on synchronicity (being-together-in-time). We will get back to that later. For the time being, and in order to understand the essence of agents' behaviour, we make the choice that reality is created via the interaction of individual agents that create emergent behaviour. Using the words of the famous Spanish poet Machado: "there is no path; you lay down the path in walking."
What do we understand by enacted and embodied cognition within an autopoietic system? An autopoietic system is a concept out of neurobiology that describes the behaviour of any neurobiological colony, hence including human behaviour. An autopoietic system is one that organises and reproduces itself in a way that is ideal for survival. The human body is an excellent example of an autopoietic system. Cells in the body continuously reproduce to allow the body to survive. Furthermore, the body is completely self-organised. Within such a system we can identify a mind (say an individual's mind) that is embodied, which means that it is not just "embrained" (the computer metaphor) but literally distributed through the body via the sensors (the human senses) in continuous contact with its environment.
The environment co-creates the mind. Cognition that eventually will lead to behaviour is then enacted. Enaction has two dimensions: action and shaping. Therefore cognitive action always contains these two components: action and creation. All the rest is information. Here we hit a common misunderstanding between knowledge and information. Information is static (and linear) and therefore can be copied and repeated, whereas knowledge is dynamic (and non-linear) and therefore needs to be created each time over and again. Complexity theory (Nicolis and Prigogine) has proven that perspective to us over the last 30 years. The enacted view of knowledge (and behaviour) allows us to explore models that have creative force and show emergent behaviour.
An often-made assumption, which we presume is too limited, is that rational (human) behaviour could only be causal (based on the hidden ontological assumption described above). If it is causal, one can write it down in equations that in turn would drive reality. If we really believe in behavioural theories, then let us take this to its finality: agent theory.
For clarity's sake, we have already touched upon a few concepts of complexity theory (dynamic non-linear systems behaviour) that shed a completely different light on market behaviour (Baets, 1998a and b). Systems are auto-organisational, based on an embodied mind and on enacted cognition. Systems and knowledge are each time over and again re-created (which is, by the way, what our brain does, since it is the most efficient way of organisation). Reality is not Newtonian (a fixed time-space concept) but emergent (co-created in interaction). In my habilitation thesis I have called that "the quantum structure of business" (Baets, 2004). Complexity theory goes much further, but for the purpose of our argument, we can leave it here.
An interesting development, based on this complexity theory, is what we know as artificial-life research (Langton) and one of its further developments, agent-based simulation (Holland). Agent-based simulation is a development in artificial intelligence that differs from what AI is unfortunately still best known for, i.e. expert systems, in that it exhibits learning behaviour. Indeed, agent simulations are based on the interaction of individual agents that have individual qualities and purposes and that agree upon a minimum set of interaction rules. Behaviour is clearly dynamic and produced in the continuous interaction of agents that exchange information with each other. The least one can say is that this is very much like human behaviour, particularly in financial markets. Whereas catastrophe theory implies a time dimension, agent-based simulation gives due importance to what Prigogine calls the constructive role of time. Each instance we bring in the arrow of time, let us say the constructive role of interaction, behaviour gets created; it literally emerges.
This view supposes a number of interacting agents within a specific field (of action), each having their personal qualities and goals and following a minimum set of interaction and exchange rules. The question then becomes how such a complex system could come to a coherent state. Most suggestions go in the same direction. Varela suggests resonance as the mechanism; Sheldrake suggests morphogenetic fields: sense is made out of interaction in a non-causal way. This mechanism of resonance is what occurs in SWARM-like societies (Epstein and Axtell, 1996). In fact we are talking here of agent theories. In agent theory, as already suggested, we only have to identify the playing ground (let us say a particular financial market) and a number of agents. Each agent is autonomous in achieving his goal(s) and is of course gifted with qualities (like experience, information, human characteristics). Those agents interact with each other based on a minimum number of interaction rules. Those rules govern the behaviour in the simulation, but they also define the learning of the different agents. Agents, then translating learning into (new) action, continuously co-create new (and adapted) behaviour in interaction with each other. Indeed, in such a market the path is laid down in walking, just as reality happens to be in financial markets.
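To make these mechanics concrete, the following minimal sketch (purely illustrative, and not the authors' model) simulates a SWARM-like market in Python: autonomous agents with an individual quality (optimism), one minimum interaction rule (partial adaptation to a few randomly met neighbours), and a price that emerges from their interaction. All parameter values here are hypothetical choices.

    import random

    class Agent:
        def __init__(self):
            self.optimism = random.uniform(-1.0, 1.0)   # individual quality

        def order(self, neighbours):
            # Minimum interaction rule: partly adapt to the average view of a
            # few randomly met neighbours (learning), partly keep one's own.
            local_view = sum(n.optimism for n in neighbours) / len(neighbours)
            self.optimism = 0.8 * self.optimism + 0.2 * local_view
            return 1 if self.optimism > 0 else -1       # buy or sell one unit

    def simulate(n_agents=100, n_steps=200, seed=1):
        random.seed(seed)
        agents = [Agent() for _ in range(n_agents)]
        price, prices = 100.0, [100.0]
        for _ in range(n_steps):
            net_demand = sum(a.order(random.sample(agents, 5)) for a in agents)
            price *= 1 + 0.001 * net_demand / n_agents  # excess demand moves price
            prices.append(price)
        return prices

    print(f"final price: {simulate()[-1]:.2f}")

No equation drives the path; different seeds lay down different paths, which is precisely the emergent, constructed-in-time behaviour described above.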
The argument takes catastrophe theory one step further, to its intrinsic ultimate claim: that time plays a constructive role in (market) behaviour. In our own research (Baets 2004 and 2005) a number of projects have been undertaken using agent theories, though not yet in the financial markets. Agent theories have been successfully used to visualise the emergence of innovation in a large consumer goods company, to visualize emergent market behaviour in identifying an adapted market introduction strategy, and also to study emergent states in conflict handling.

The basic question does lead us back to the ontological choices we discussed earlier. Once we accept complexity theory as a promising paradigm, we cannot avoid the question of causality. Quantum mechanics has given the world a tremendous dilemma: how is it possible that two photons moving in different directions still keep in instantaneous contact? As Pauli (Van Meijgaard), amongst others, suggests, there should indeed be interaction in a non-local field. Things seem to occur at the same time without having any causal relationship. It is this quantum structure of (financial) markets that deserves our attention in order to improve our understanding of market behaviour (Baets, 2004).

References

Baets W, 1998a, Organizational Learning and Knowledge Technologies in a Dynamic Environment, Kluwer Academic
Baets W, 1998b, Guest editor of a special issue of Accounting, Management and Information Technologies, Complex adaptive systems, Elsevier
Baets W, 2004, Une interprétation quantique des processus organisationnels d'innovation, thèse de HDR, IAE Aix-en-Provence
Baets W (ed), 2005, Knowledge Management: Beyond the Hypes, Kluwer Academic, forthcoming
Epstein J and Axtell R, 1996, Growing Artificial Societies, MIT Press
Holland J, 1998, Emergence: From Chaos to Order, Oxford University Press
Langton C (ed), 1989, Artificial Life, Santa Fe Institute Studies in the Sciences of Complexity, Proceedings, Vol 6, Addison-Wesley
Maturana H and Varela F, 1980, Autopoiesis and Cognition: The Realization of the Living, Reidel
Maturana H and Varela F, 1992, The Tree of Knowledge, Scherz Verlag
Nicolis G and Prigogine I, 1989, Exploring Complexity, Freeman
Sheldrake R, 1995, The Presence of the Past, Park Street Press
Sheldrake R and Bohm D, 1982, Morphogenetic fields and the implicate order, ReVision
Van Meijgaard H, 2002, Wolfgang Pauli Centennial 1900-2000, PhD thesis, TU Twente

Sub-Part Two by Dr. Bernard Paranque

Behavioral Finance and Technical Analysis

Behavioral Finance and Technical Analysis point out the coordination problem of the agents' actions. More precisely, the question is the effect of the action of certain types of agents on the collective welfare and, by consequence, how to mitigate the negative consequences. There are three main answers: the first is through laws and other professional rules, such as those of the SEC, Basle and so on; the second is the availability of tools allowing actors to avoid the problems, such as the one in the article quoted above; the last is at the level of the individual and his/her own capacity to take into account the collective interest, or so-called social welfare. Neoclassical economic theory says that, under specific hypotheses, the market, and particularly the financial market, is the best way to ensure the right allocation of resources. But, since those hypotheses are never verified, we need other tools to manage the market. We need to have tools to help us. We won't speak about laws and regulatory rules such as those Basle II is prepared to impose. Rather, we will focus on individual behavior.
A lot of criticism points out the myopia of the agents and the mimetism of their decisions, even when these attitudes are the cause of the breakdown in the distribution of social welfare. One may lose confidence in the others or, more relevantly, decide to change because one is able to influence the market (in fact it is the only way to win, because the winner needs a lot of losers, which means a lot of followers). It could be possible to anticipate the breakdown if we are able to identify the main proxies of these strategies, as was demonstrated in our article (Part A). I feel that our individual social responsibility is of true significance. This responsibility can't be assumed without clear rules of action. I would like to take an example from the work of Jensen.
In an article published in October 2001, Jensen highlighted the operational limitations of the prevailing interpretation/use made of value maximization and the stakeholder theory. He then engaged in a criticism of the central model of entrepreneurship, with its polar figures of the manager and the shareholder. In addition, he wanted to introduce the other stakeholders of the firm. Without it being explicitly stated, it seems that the different financial scandals may have a bearing on the desire to explain the operating conditions proposed by the maximization versus stakeholder theories, which are in some ways competing and in other ways complementary.
On the one hand, it is argued that value maximization for the shareholder, with all the problems this type of monitoring entails, remains the best way to attain social welfare in a market economy. On the other hand, stakeholder theory stresses the need to take into account the interests of all of the stakeholders in a firm, including the customers, all of the suppliers, and the employees. According to Jensen, the complementarity of the two theories stems from the need to understand value maximization from a collective point of view: social welfare is only achieved when all of the value contributed by each of the stakeholders is maximized, and when this maximization of value occurs over the long term. The result is that the firm is recognized as a historical and complex organization.

However, an operational problem arises if managers are expected to maximize value thus defined, in that there is no reason why the objectives of the various stakeholders should coincide. This criticism is valid both from the point of view of value maximization (how can several objectives be managed simultaneously?) and that of stakeholder theory (how is a common objective to be defined?).
In fact, while Jensen recognizes the relevance of the stakeholder theory, he underlines a problem: this theory is not able to answer the question of how to manage several aims which could diverge. He says that before managing the firm, maximizing its value and taking into account the wishes of the stakeholders, there is the need to obtain an agreement, on the one hand about the hierarchy of the aims, and on the other hand about the modalities of their accomplishment and the monitoring of the performance of the firm.2

The agreement is the core of the deal and of the future performance, because it will determine the manager's value maximization strategy, in particular in the field of the organization of the firm. The supporters of the stakeholder theory have a tool, the balanced scorecard, but, as Jensen notes, they say nothing about the necessity to obtain beforehand an agreement on the objectives from every participant involved in the firm and then, along the way, to build common rules to play by.
This concern means that social welfare implies dealing with "problems of information, anticipation and evaluation" (Salais et al., 1986, p. 193). In fact, at a collective level but also at an individual level, we need to agree on a common reality, not only to build it but also to agree to act together in this perspective: "the stake of these negotiations is the interpretive model to be retained in order to construct the reality that presents itself to them [the agents] as a problem to be solved" (id., pp. 197-198). In other words, this necessary negotiation expresses a convention through which "the agreement of the agents on their description of the world, thus [allowing] them to coordinate their projects" (id., p. 236) is approved. That kind of agreement "rests on social processes of elaborating models for representing reality" (id., p. 239).
Then the question is how to manage this agreement at a collective level and at an individual level. We need to identify specific coordination principles on which we can obtain an agreement from the stakeholders, and the availability of specific tools giving the opportunity to manage the collective behavior by anticipating the risk of breakdown, that is, the behavior of the one who does not play with the same aim. But it is not possible to negotiate this kind of agreement without discussing the relevance of the criteria of management and the sense of performance, and hence their different meanings among the stakeholders. For example, from the workers' point of view, the starting point must be the value added and not the EBITDA or the cash flow, because the value added is the condition of their wages, despite the fact that the wages have an influence on the profit.3
In total, "to undertake with efficiency presupposes mastering the uncertainty relating to markets, technologies and future products, and the coherence of one's own projects with respect to those of the other agents, partners or competitors" (id., p. 246). Nevertheless, the main point is the coordination of the agents' behaviour, which has to deal with the management of uncertainty.

"In a context of relations with others from which one cannot abstract oneself, the uncertainty attached to the person must be understood as a communicational uncertainty. However, this designation is itself ambiguous, for it could suggest that uncertainty comes down to a problem of the circulation of information, to an imperfection. Yet a piece of information can only circulate if it has first been elaborated in a common language and if, consequently, it can be adjusted on either side within a device congruent with it (for example, the presence of identical codes)" (Salais and Storper, 1993, pp. 76-78).

The wage regulation system ("la forme salaire" in French) keeps the workforce unaware of the work that has been achieved (Salais et al., p. 255), to the extent that the accomplished work is revealed through the produced value once the intermediary consumptions have been paid, namely the added value (see p. 227, as well as the work by Paul Boccara on the subject, 1985).

References

Boccara, P. (1985), Intervenir dans les gestions avec de nouveaux critères, Editions Sociales
Jensen, M. C. (2001), Value Maximization, Stakeholder Theory, and the Corporate Objective Function (October). Unfolding Stakeholder Thinking, eds. J. Andriof et al. (Greenleaf Publishing, 2002). Also published in JACF, V. 14, N. 3, 2001, European Financial Management Review, N. 7, 2001, and in Breaking the Code of Change, M. Beer and N. Nohria, eds, HBS Press, 2000. http://ssrn.com/abstract=220671
Paranque, B. (2004), Toward an Agreement (February). Euromed Marseille Ecole de Management Paper No. 11-2004. http://ssrn.com/abstract=501322
Storper, M. and Salais, R. (1997), Worlds of Production: The Action Framework of the Economy, Harvard University Press, London
Salais, R. et Storper, M. (1993), Les mondes de production, École des Hautes Études en Sciences Sociales, Paris
Salais, R., Baverez, N., Reynaud, B. (1986), L'invention du chômage, PUF (1999, édition PUF Collection Quadrige)


Sub-Part Three by Henry O. Pruden, Ph.D.

Chart Analysis as Sequential Art

The technical analysis interpretation of the data from the Cal Tech Experiment on Irrational Exuberance found in the Pruden, Paranque and Baets article (Part A) invites critical reflection. The very nature of pattern recognition requires good judgment by the analyst in isolating and interpreting appropriate and significant portions of chart data. Technical tools such as trend lines are extremely useful in separating out portions of chart data for further analysis. But as observed in our article, Part A, the technical patterns identified involved comprehensive and varying perspectives. That interpretation was an art form. Technical market analysis of charts can perhaps be enriched and made more reliable through an understanding and application of the principles of Sequential Art.4

In this section I propose to revisit four of the technical analysis rules or concepts that were presented in Part A. These will be re-interpreted with the aid of principles and patterns adopted, with modification, from the notions of Sequential Art. The reader will thus be given an opportunity to reflect upon the value added made possible for his/her applications of chart analysis and pattern recognition.
The four technical tools applied in Part A to be reviewed here in Part B are:
1. Fear vs. Greed Juxtaposed
2. Trading Range Channels Along Tops and Bottoms
3. Descending Price Peaks
4. Catastrophic Panics Causing Price Gaps
Graphics of each of these rules are contained in the Appendix to this Part B
article.
1. FEAR VS. GREED JUXTAPOSED

In the original article (Part A), expressions of fear were observed as growing and expressions of greed were observed as shrinking as the market price behavior neared the breakdown, the catastrophe jump point. One principle of Sequential Art is that of the interdependence between the sounds (musical notes) and the visual indications shown on the charts. The sounds (words, musical notes) and the marks on the chart picture go hand in hand to convey an idea of changing market sentiment that neither could convey alone. Here one can see the importance of how sounds, words, musical notes and pictorial indicators support each other's strengths. This gives rise to the suggestion that technical analysis ought to include the careful annotation of junctures where it can be seen that behavior is changing on a chart. Please notice how poignant is the use of icons, in this case musical symbols for high and low offers and bids, to convey the changing juxtaposition of fear vs. greed. The combination of sound and visual clues also suggests a superior means of conveying the distinct characteristics of a sentiment indicator; together they are a fine way of communicating the emotional content of the information.
2. TRADING RANGE CHANNELS ALONG TOPS AND BOTTOMS

The data shown in the trading range provide a good opportunity to cover the essentials of Sequential Art. First there is the ideal purpose, which in the case of trend channels is to outline the expected future course of price behavior... the idea is to define and extrapolate. The form employed was that of displaying price behavior over time in graphic panels, which is to say, to create a chart. A different type of form could have been the depiction of the trend through averaging and simplifying the data into a moving average. Thus the technical analyst, as a practical artist, has choices to make in the selection of form.

Another artful choice is the structure of the sequencing of chart data over time. A trend channel only makes sense if it has a beginning and an ending. In the case of the Cal Tech Experiment there were three structural sections. First came the trend of prices that reflected the progress, the growth, of speculation. Then there was a section labelled the dissipative gradient; that panel could have been made even more distinctive through a change in colors, say from green to yellow. The third section that could have been framed was the panic/crash in prices, and this third panel could have been further separated with the addition of the color red.

The artist/technician would thus set up a sequence of meaningful parts or patterns that taken together would tell the story of boom and bust upon the surface of the market. At a deeper level of analysis, the separation into three distinct yet interdependent sequential panels fits with the technician's vocabulary and the iconography of chart patterns that have been established through experience to communicate meaning as to the present position and probable future trend of a market. The separation into a sequence of separate panels really clarifies the picture and tells the story of the market.
3. DESCENDING PRICE PEAKS

In this case we cut incisively into the available chart to abstract a sequential order of events that the technical analyst then moulds into a pattern. An icon for symbolizing the motion into the future and in the downward direction defined might be the famous abstract art depiction of Marcel Duchamp's "Nude Descending a Staircase." Indeed, the entire cubist movement in visual art might be a rich area for a technical analyst to study. The parallels between the cubist art form and the work of the technical analyst are strong and suggest that borrowing from art depicting motion might pay the technical analyst a large dividend. Among other things, the analyst can become sensitized to how connecting the dots of the descending price peaks reveals a picture plane, gives closure to the unifying properties, and makes the viewer more aware of the design, the trend, as a whole rather than simply the individual components. This in effect is the beauty of trendlines. In support of simple trendlines, it has been observed that Duchamp, more concerned with the idea of motion than the sensation, would eventually reduce such concepts as motion to a single line.5

The moment-to-moment and action-to-action progression of the abstraction of triple descending peaks does not require too much involvement by the viewer to interpret the meaning. It is clear-cut, decisive and powerful. Furthermore, the actions and intentions of the buyers and sellers which underlie the descending peaks lend themselves to a common-sense interpretation for grasping the implication of descending price peaks. Implicit in the foregoing conclusion is the realization that effective chart interpretation involves the analyst, who identifies and frames sequences, and then the observer, who reads and internalizes the sequences of panels and labels to inform himself/herself of the motion revealed and the action required.
4. CATASTROPHE PANICS CAUSING PRICE GAPS

The discontinuity of price transaction behavior creates visual gaps or gutters that separate panels of price action. It is the acute imbalance between supply and demand which creates those gaps. Sequential Art identifies these gaps or separations where nothing has been recorded as "gutters." What attracts the analyst's attention is the comparison of panels of price action before and after a gap. Why? Because the comparison has forecasting implications. The gaps or gutters fracture both time and space, offering a jagged, staccato rhythm of unconnected moments. But the observer's ability to construct continuity across panels generates the ability to mentally create closure. Like Sequential Art, this illustration of gap analysis reinforces that technical market analysis of chart behavior is very much an interplay between the observer and the observed.

Footnotes

1 Scott McCloud, Understanding Comics: The Invisible Art, The Kitchen Sink Press (a division of Harper and Collins), 1993
2 Those interested may read a comment, in French, in Paranque (2004).
3 "The wage form keeps employees unaware of the work accomplished" (Salais et al., p. 255), insofar as this accomplished work is expressed in the value produced once the intermediary consumptions have been paid, namely the value added (see p. 227 and the works of Paul Boccara on the subject, 1985).
4 Scott McCloud, Understanding Comics: The Invisible Art, The Kitchen Sink Press (a division of Harper and Collins), 1993
5 McCloud, page 108.

Appendix

Figure 1. A Cusp Catastrophe Model of a Stock Exchange
Figure 2. Dissipative Gradient
Figure 3. The Overall Results of the Experiment
Figure 4. Applying Technical Analysis
Figure 5. Fear vs. Greed Juxtaposed
Figure 6. Trading Range
Figure 7. Descending Price Peaks
Figure 8. Catastrophe Panic Causing Price Gaps
Figure 9. Mental Discipline Needed to Win the Greater Fool Game

About the Authors

DR. WALTER BAETS

Walter R. J. Baets is Director of Graduate Programs at Euromed Marseille - Ecole de Management and Distinguished Professor in Information, Innovation and Knowledge at Universiteit Nyenrode, The Netherlands Business School. He is also director of Notion, the Nyenrode Institute for Knowledge Management and Virtual Education. Previously he was Dean of Research at the Euro-Arab Management School in Granada, Spain. He graduated in Econometrics and Operations Research at the University of Antwerp (Belgium) and did postgraduate studies in Business Administration at Warwick Business School (UK). He was awarded a Ph.D. from the University of Warwick in Industrial and Business Studies.

He pursued a career in strategic planning, decision support and IS consultancy for more than ten years before joining the academic world, first as managing director of the management development centre of the Louvain Universities (Belgium) and later as Associate Professor at Nijenrode University, The Netherlands Business School. He has been a Visiting Professor at the University of Aix-Marseille (IAE), GRASCE (Complexity Research Centre) Aix-en-Provence, ESC Rouen, KU Leuven, RU Gent, Moscow, St Petersburg, Tyumen and Purdue University. Most of his professional experience was acquired in the telecommunications and banking sector. He has substantial experience in management development activities in Russia and the Arab world.

His research interests include: innovation and knowledge; complexity, chaos and change; the impact of (new information) technologies on organisations; knowledge, learning, artificial intelligence and neural networks; on-line learning and work-place learning.

He is a member of the International Editorial Board of the Journal of Strategic Information Systems, Information & Management and Systèmes d'Information et Management. He has acted as a reviewer/evaluator for a number of international conferences (e.g. ECIS and ICIS) and for the EU RACE programme. He has published in several journals including the Journal of Strategic Information Systems, The European Journal of Operations Research, Knowledge and Process Management, Marketing Intelligence and Planning, The Journal of Systems Management, Information & Management, The Learning Organization and Accounting, Management and Information Technologies. He has organised international conferences in the area of IT and organizational change.

Walter Baets is the author of Organizational Learning and Knowledge Technologies in a Dynamic Environment, published in 1998 by Kluwer Academic Publishers, and co-author with Gert Van der Linden of The Hybrid Business School: Developing Knowledge Management through Management Learning, published by Prentice-Hall in 2000. Along with Bob Galliers he co-edited Information Technology and Organizational Transformation: Innovation for the 21st Century Organization, also published in 1998 by Wiley. In 1999, he edited Complexity and Management: A Collection of Essays, published by World Scientific Publishing. Recently he co-authored Virtual Corporate Universities, published 2003 by Kluwer Academic.
DR. BERNARD PARANQUE

Bernard Paranque is a doctor of economics (University of Lyon Lumière, 1984) and holds the Habilitation à diriger les recherches (1995). He began his career as an associate economist in an accountancy firm in 1984. In 1990, he joined the Banque de France (French central bank) business department. From 1990 to 2000 he produced papers on the financial structure of non-financial companies (www.ssrn.com). He was a representative of the Banque de France on the European Committee of Central Balance Sheet Offices between 1993 and 2002.

In 1999, he was on secondment from the Banque de France to the Secretary of State to SMEs, where he was in charge of the business financing department. He was also a member of the French delegation to the SMEs working party of the Business and Environment Committee of the OECD.

His research refers to the "économie des conventions" and is focused on the financial behavior of non-financial organizations and the promotion of specific tools and assessment procedures designed to enhance SMEs' access to financing.

He is co-author with Bernard Belletante and Nadine Levratto of Diversité économique et mode de financement des PME, published in 2001. He is also the co-author of Structures of Corporate Finance in Germany and France, with Hans Friderichs, in Jahrbücher für Nationalökonomie und Statistik, 2001.

He is an associate researcher of the CNRS team IDHE-ENS Cachan in Paris and a member of the New York Academy of Science.

He joined Euromed Marseille Ecole de Management as Professor of Finance and Head of the Information and Finance Department.
DR. HENRY O. PRUDEN

Hank Pruden is a visiting scholar at Euromed Marseille Ecole de Management, Marseille, France during 2004-2005. Professor Pruden is a professor in the School of Business at Golden Gate University in San Francisco, California, where he has been teaching for 20 years. Hank is more than a theoretician; he has actively traded his own account for the past 20 years. His personal involvement in the market ensures that what he teaches is practical for the trader, and not just abstract academic theory.

He is the Executive Director of the Institute of Technical Market Analysis (ITMA). At Golden Gate he developed the accredited courses in technical market analysis in 1976. Since then the curriculum has expanded to include advanced topics in technical analysis and trading. In his courses Hank emphasizes the psychology of trading as well as the use of technical analysis methods. He has published extensively in both areas.

Hank has mentored individual and institutional traders in the field of technical analysis for many years. He is presently on the Board of Directors of the Technical Securities Analysts Association of San Francisco and is past president of that association. Hank was also on the Board of Directors of the Market Technicians Association (MTA). Hank has served as vice chair for the Americas of IFTA (International Federation of Technical Analysts); IFTA educates and certifies analysts worldwide. For eleven years Hank was the editor of The Market Technicians Association Journal, the premier publication of technical analysts. From 1982 to 1993 he was a member of the Board of Trustees of Golden Gate University.


Mutual Fund Cash Reserves, the Risk-Free Rate and Stock Market Performance

Jason Goepfert
The most basic tenet of contrarian investing is that one should buy when others are fearful and sell when they are eager to buy. The definitions of "fearful" and "eager" are open to interpretation, but one assumption that has persisted over the decades is that low levels of cash reserves held at mutual fund firms were a sign of excessive optimism. Looking at the relationship between cash reserves and the risk-free rate of return, however, suggests that portfolio manager sentiment is not the only, or perhaps even the largest, component of cash reserve levels. By backing out the effects of interest rates, we can get a better feel for the sentiment of these portfolio managers, as well as potential stock market returns going forward.
Cash reserves at mutual funds in the United States reached historically low levels in 2004. The traditional interpretation suggests that fund managers were too optimistic on likely future stock market gains. But is the traditional interpretation accurate?

Cash Reserves vs. the Risk-Free Rate

As of June 2004, liquid assets of stock mutual funds, expressed as a percentage of total net assets, stood at 4.3%. This level of reserves, relative to total assets, was one of the lowest in the history of reported data. At the time the figures were released, there was a great deal of media attention focused on the idea that fund managers in the United States were overly enthusiastic about the prospect of future gains in the equities market, and thus the market was likely going to have difficulty making significant advances. The logic of such an argument may be sound, but a look into another, perhaps more important, factor sheds some light on why cash levels at mutual funds were so low.

There are many reasons why a fund would hold a low level of cash:
• They believe the market is going higher and want to be as fully invested as possible.
• They use derivative securities (such as futures and options) and don't need actual cash on hand in order to hedge their portfolios.
• Their charter (or a mandate from investors or management) requires that they remain as invested as possible, having enough cash on hand only to meet expected redemptions. They are not expected to time the market, only find good stocks. With the improved reporting systems now in place at some fund firms, portfolio managers can see redemptions on virtually a real-time basis, reducing the likelihood that they will wake up one day with a cash crunch.
• The increased influence of index funds precludes market timing. These managers aren't expected to give investors a positive absolute return; they are only expected to beat their respective benchmark index. Having a high level of cash increases their chances of underperforming their benchmark in a rising market.
• There aren't many other instruments available that would give their investors an acceptable reward for the risk they are taking.

It is on that last point that I wish to focus. When short-term interest rates are high, mutual funds have an incentive to hold cash. If there is a risk-free investment that will pay an 8% return, is it unreasonable to expect a fund manager to shift funds there as opposed to risking them in the equities market, where they may get an 8% return during a good year, but with a great deal more risk? Most of us would surely switch to the risk-free opportunity. For these purposes, we will use the yield on 90-day Treasury Bills as the risk-free rate of return.

This assumption is certainly supported by the numbers. From 1954 through 2003, the correlation between mutual fund cash levels and the 90-day T-Bill rate was 0.74, which means that the prevailing level of interest rates can theoretically explain 55% of why mutual fund cash levels are where they are. Figure 1 shows this correlation: there is a clear upward slope to the scatter plot, with minimal variation.

FIGURE 1
Correlation Between Cash Level and Risk-Free Rate, 1954-2003

With 591 data points, the probability of the correlation between cash levels
and interest rates being due to chance alone is essentially zero.
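As a sketch of how such a claim can be checked (the article does not spell out its test), the standard t-statistic for a Pearson correlation, t = r * sqrt((n - 2) / (1 - r^2)), can be computed directly:

    from math import sqrt
    from scipy.stats import t

    r, n = 0.74, 591
    t_stat = r * sqrt((n - 2) / (1 - r ** 2))      # roughly 26.7
    p_value = 2 * t.sf(t_stat, n - 2)              # two-sided p-value
    print(f"t = {t_stat:.1f}, p = {p_value:.3g}")  # p is ~0 to machine precision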
Using regression analysis, we can see the relationship between interest rates
and cash reserves. This will allow us to determine what an appropriate level
of cash reserves may be given a certain interest rate, which we can then use to
compare to current cash reserves. If current reserves are too low given prevailing rates, then fund managers may be overly optimistic; if they are too high,
then they may be excessively pessimistic.
Using data from 1954 through 2003, the regression formula for the relationship between interest rates and mutual fund cash reserves is:

y = 0.4978x + 4.5464

where
y = expected cash reserve
x = current rate on 90-day T-Bills

We can round off these figures and still retain the usefulness of the formula. Put into different terms, the regression formula tells us that cash reserves during any given month should be approximately 4.5% plus 50% of the current yield on 90-day T-Bills. Theoretically, if 90-day T-Bills were yielding 0%, then mutual funds would be expected to carry 4.5% of their assets in liquid investments. This is a baseline amount of cash, presumably needed to cover expenses, redemptions and the like.


Figure 2 uses the regression formula to show what percentage of cash we would expect mutual funds to hold given a range of values in T-Bill yields.

FIGURE 2
T-Bill Yields and Expected Cash Reserves

90-Day T-Bill Yield    Expected Cash Reserve
 1.0%                  5.0%
 2.0%                  5.5%
 3.0%                  6.0%
 4.0%                  6.5%
 5.0%                  7.0%
 6.0%                  7.5%
 8.0%                  8.5%
10.0%                  9.5%

We know that in June 2004, cash reserves were at 4.3% of total assets. On June 30th, the yield on 90-day T-Bills was 1.31%. By plugging that value into the regression formula, we estimate that mutual funds should have carried 5.20% of their assets in cash. By taking the difference between what was expected and what was fact, we can conclude that mutual funds were carrying a cash deficit of 0.90%:

Actual    Expected    Surplus/(Deficit)
4.30%     5.20%       (0.90%)

By going back and comparing actual levels of cash to those that were expected given the prevailing level of interest rates, we can get a better handle on the sentiment of portfolio managers without the distorting effects of interest rates on cash reserves. The difference between actual and expected reserves will show whether fund managers are giving a premium or discount to cash, and it should create an effective contrary sentiment indicator. For purposes of brevity, we will call the difference between actual and expected cash reserves RAPAD (Rate-Adjusted Premium And Discount).
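In code, RAPAD is one subtraction on top of the regression sketch above (again, the function name is mine):

    def rapad(actual_reserve_pct, tbill_yield_pct):
        # Actual cash reserve minus the rate-adjusted expected reserve, in
        # percentage points; negative values are a cash "discount".
        expected = 0.4978 * tbill_yield_pct + 4.5464
        return actual_reserve_pct - expected

    print(round(rapad(4.3, 1.31), 2))   # June 2004: about -0.90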

Adjusted Reserves as a Sentiment Indicator

During the 49 years of the study, the mean value of the RAPAD measure was 0.0%, with a standard deviation of 1.5%. The distribution of readings from this measure hugs a normal bell curve closely, so standard statistical measures should apply. If we look at how the market, defined as the S&P 500 cash index, performed after abnormal readings, we can begin to get an idea of how effective this measure may be at highlighting high- or low-risk times in the stock market. For these purposes, we are defining abnormal as any reading more than 1.5 standard deviations away from the mean, which in this case would equate to all RAPAD readings less than -2.25% or greater than +2.25%. Put another way, we will see how the market performed after any month when mutual funds held 2.25% more or less cash than they should have held given the prevailing level of interest rates.

FIGURE 3
S&P 500 Performance After RAPAD Reading of -2.25% or Below (Extreme Cash Discount)

                    6 Months   12 Months   18 Months   24 Months
                    Later      Later       Later       Later
Average Return      -3.0%      -6.1%       -5.5%       -1.8%
Percent Positive     31%        22%         36%         50%

Figure 3 shows how the S&P 500 performed for up to 2 years after mutual funds were holding cash reserves that were at least 2.25% less than they should have been given the level of short-term interest rates at the time. The primary reason for giving cash such a discount was likely that the fund managers felt very optimistic about the future gains they were likely to make in the stock market, so they felt the need to be as fully invested as possible. As we can see from the table, this optimism was generally unwarranted. If we look at the results after 12 months, the S&P 500 showed an average return of -6.1%. Looking at the months where cash levels were in a normal range (meaning RAPAD readings within 1.5 standard deviations of the mean), the average 12-month return in the S&P 500 was 8.7% during the study period. One-year returns after extreme cash discounts therefore underperformed an average return by 14.8%. We also see from Figure 3 that the S&P 500 was higher 12 months later only 22% of the time. There were 36 months that were considered to show an extreme cash discount, and only 8 times out of those 36 instances was the S&P 500 higher one year later.

FIGURE 4
S&P 500 Performance After RAPAD Reading of +2.25% or Above (Extreme Cash Premium)

                    6 Months   12 Months   18 Months   24 Months
                    Later      Later       Later       Later
Average Return       8.3%      14.1%       19.4%       23.0%
Percent Positive     81%        89%         98%        100%

Figure 4 gives us the performance after periods of extreme cash premiums, meaning those times when fund managers held at least 2.25% more cash than expected. The results here are markedly different from Figure 3. After 12 months, the S&P 500 was an average of 14.1% higher, outperforming an average month by 5.4%. Out of the 53 months that qualified as exhibiting an extreme cash premium, 47 led to a higher market one year later, for a success rate of 89%. See Appendix A for a detailed list of all extreme RAPAD readings during the study period.

Figure 5 below shows the correlation between RAPAD readings and S&P 500 returns 12 months later.

FIGURE 5
S&P 500 12-MONTH RETURNS AND RAPAD READINGS

The correlation between RAPAD readings and returns in the S&P 500 one
year later is 0.32, suggesting that if we knew nothing else but what the current
RAPAD reading was, we could improve our prediction of where the S&P 500
would close one year later by about 11%.
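That figure follows from the squared correlation: r squared estimates the share of variance in forward returns explained by the indicator. A quick sketch (array names are placeholders, not the study's code):

import numpy as np

def rapad_explained_variance(rapad, fwd_12m_returns):
    # Pearson correlation and its square (share of variance explained).
    r = np.corrcoef(rapad, fwd_12m_returns)[0, 1]
    return r, r ** 2

# With r = 0.32, r squared is roughly 0.10, i.e. a 10-11% improvement
# over a naive forecast that ignores RAPAD.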

Why Adjust for Interest Rates?


A valid question is why we have to adjust for interest rates at all - aren't cash levels by themselves a good enough indicator of excessive optimism or pessimism by fund managers? Monitoring cash levels on their own can indeed be an adequate contrary guide. However, there have been times when adjusting for interest rates has given a much better indication of excess. Figure 6 highlights just such an instance.


FIGURE 6
CASH LEVELS VS. RATE-ADJUSTED CASH LEVELS

Figure 6 shows us a plot of the S&P 500 (top scale), the raw values of
mutual fund cash reserves (middle scale) and the RAPAD measure of cash premiums and discounts (lower scale).
On the chart, Point A corresponds to July 1976. At the time, 90-Day T-Bills
were yielding about 5.2%. According to the regression formula, mutual funds
should have been holding about 7.1% of their assets in cash. However, they
were holding only 4.7% cash, so they were holding about 2.4% less cash reserves than they should have been given the level of short-term rates at the
time. This was a show of extreme optimism on the part of fund managers, and the S&P refused to accommodate them, declining into the beginning of 1978.
By early 1980, managers had built up their cash reserves once more, just in
time for a stiff market rally over the next year. In January 1981 (Point B on the
chart), 90-Day T-Bill rates had climbed all the way up to 14.6%, giving fund
managers a very enticing incentive to hold large amounts of cash. They did have
significantly more cash then than they did in 1976. At Point A, cash levels were
around 4.7%, as stated above. At Point B, cash levels stood at 8.3%. Taken on its
own, one could have easily concluded that fund managers were nowhere near as
optimistic at Point B as they were at Point A. However, when we factor in
prevailing interest rates, theoretically fund managers should have been holding
11.8% of their assets in cash at the time. Since they only had 8.3% in cash
reserves, they were once again deficient by an extreme amount (3.5%). This told
us that fund managers were indeed too optimistic; contrary investors should
have expected a market decline (or at least difficulty making much headway),
and the S&P ultimately declined sharply over the next one and a half years.

Out-of-Sample Testing and Other Technical Analysis Applications

In order to get an idea of how this method would have worked in real-time
(without the perfect knowledge of hindsight), out-of-sample testing is necessary. This is where we take only a portion of the data as the look-back period
for the regression formula, and then test to see how it would have predicted
future moves in the S&P 500.
Using the period from 1954 through 1976 as the lookback period, the regression line between mutual fund cash levels and the 90-day T-Bill rate remained quite consistent with what was presented above:


y = 0.5336x + 4.0163
where
y = expected cash reserve (%)
x = current rate on 90-day T-Bills (%)
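For readers who wish to replicate the arithmetic, here is a short sketch of the formula in Python (the coefficients are the lookback-period values quoted above; the full-sample line used earlier in the paper is similar but not identical):

def expected_cash_reserve(tbill_rate_pct):
    # Expected mutual fund cash level (%) for a given 90-day T-Bill rate (%).
    return 0.5336 * tbill_rate_pct + 4.0163

def rapad(actual_cash_pct, tbill_rate_pct):
    # Rate Adjusted Premium And Discount: actual minus expected cash.
    return actual_cash_pct - expected_cash_reserve(tbill_rate_pct)

# Point B from Figure 6: 14.6% T-Bill rates imply about 11.8% expected cash;
# actual cash of 8.3% leaves a deficit of about -3.5%.
# round(expected_cash_reserve(14.6), 1)  ->  11.8
# round(rapad(8.3, 14.6), 1)             ->  -3.5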
When we take this formula and determine the cash deficit or cash surplus
from 1977 through 2003 (the out-of-sample period), we can determine how
well it would have predicted future stock market returns. In Figure 5, we showed
the correlation between RAPAD readings and one-year S&P 500 returns as
being 0.32. Using this out-of-sample test, the correlation from 1977 through
2003 dropped to 0.21. However, given that correlation and the number of data
points in the sample, once again the chances are virtually zero that this relationship occurred by chance alone.
In Figures 3 and 4, we showed how the S&P 500 performed after the cash
premium or discount reached extreme levels. Taking the same approach with the
out-of-sample test, the results were very consistent. Here, extreme is considered to be any cash discount of -1.75% or less or any cash premium of +1.75%
or more. One year after extreme cash discounts, the S&P 500 was higher 25% of
the time, with an average return of -5.6%. One year after extreme cash premiums, the S&P 500 was higher 89% of the time, with an average return of 12.1%.
These results compare very favorably to those obtained previously, suggesting that the predictive power of this approach held up even during the out-of-sample testing. As with most contrary indicators, the RAPAD measure became most effective when it was giving extreme readings one way or the other.
It may be possible to achieve similar or even superior market-timing results
by applying basic technical analysis to the cash levels themselves, without the
need to adjust for interest rates. To test this, we used a simple moving average
crossover system applied to the cash balances. We went long the S&P 500 cash
index when a 12-month average of cash balances fell below their 60-month
average and then sold when the 12-month average crossed back above the 60-month average.
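A minimal sketch of this crossover rule, assuming a monthly pandas Series of cash balances (the variable name is illustrative):

import pandas as pd

def cash_ma_position(cash: pd.Series) -> pd.Series:
    # +1 (long the S&P 500) while the 12-month average of cash balances
    # sits below the 60-month average; 0 otherwise.
    fast = cash.rolling(12).mean()
    slow = cash.rolling(60).mean()
    return (fast < slow).astype(int)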
Such a system did have some merit, as it would have kept an investor out of
the bad markets of 1974 and 1987. It also would have kept one long during the
roaring bull market of the 1990s. However, as with most crossover systems,
whipsaws were an issue. Out of the 7 signals, 3 of them were losers, losing an
average of 9%. The four winners, however, gained an average of 76% (due
mainly to the 222% gain from the 1990s).
If we used a very simple RAPAD method of going long when RAPAD first
crossed above +2.25 and selling when it first crossed below -2.25 (so we would
be buying when mutual fund cash reserves first became extremely high, and we
would hold until they became extremely low), there would have been only four
trades by this strict methodology. All four were winners, for an average gain of
155% (skewed by a 450% gain from the system going long in October 1985 and
holding through March 1998). Since the data is released to the public with a
one-month delay, we used the S&P closing prices as of the date one would have
received the data, which reduced the returns somewhat but kept the trades much
more based in reality. See Appendix B for a chart of each of the trade signals.
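The strict rule just described can be sketched as follows, again assuming a monthly pandas Series of RAPAD readings; the one-month reporting delay is approximated by shifting the series forward one month:

import pandas as pd

def rapad_trade_position(rapad: pd.Series) -> pd.Series:
    # +1 from the first cross above +2.25 until the first cross below -2.25.
    delayed = rapad.shift(1)      # the data reaches the public a month late
    position = pd.Series(0, index=rapad.index)
    in_trade = False
    for i, value in enumerate(delayed):
        if not in_trade and value > 2.25:    # extreme cash premium: go long
            in_trade = True
        elif in_trade and value < -2.25:     # extreme cash discount: exit
            in_trade = False
        position.iloc[i] = int(in_trade)
    return position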

Other Factors
In the beginning paragraph, we highlighted several other factors, besides
competing assets, which may affect mutual fund cash reserves. We have looked
at what relation some of those have to cash reserves, and there does seem to be
a correlation. However, since many of these developments are so new, we do
not have enough data to draw reliable conclusions. Still, it is instructive to
discuss the impacts of these variables on cash reserves so that we can more
readily observe their impact going forward.
The listed options market has grown steadily over the past 10 years. In
1993, the Chicago Board Options Exchange was clearing approximately
9,000,000 options contracts on a monthly basis. By the end of 2003, that volume had tripled. The correlation between monthly options volume on the CBOE and mutual fund cash levels from 1993 - 2003 was -0.66. This tells us that there
was a large negative correlation between option volume and cash levels - as
option volume increased, cash levels decreased. This could be a significant
factor; however, we are limited by a lack of reliable option volume data. Also,
interest rates during this time were steadily decreasing. As we saw above, interest rates have had a definite impact on cash levels over nearly 50 years of data,
so it is difficult to determine if cash levels were impacted more by option activity or by interest rates.
We also checked the correlation between cash reserves and futures market
activity. For the latter, we used commercial trader positions (both long and short)
in the large S&P 500 futures contract from 1986 - 2003. According to the Commodity Futures Trading Commission (CFTC), a commercial trader is a large
trader (the definition of large has changed over the years) engaged in the futures market for the specific purpose of hedging the trader's daily business activity. Comparing month-end positions in the Commitments of Traders report,
there was a correlation of -0.80 between the futures positions and mutual fund
cash reserves. This is a very tight correlation and tells us that as futures positions increased, cash reserves decreased and vice-versa. Once again, however,
we are limited by the fact that interest rates declined for most of this period.
It would be possible to use a multiple regression formula to determine where
mutual fund cash reserves should be, instead of using only interest rates as
described above. However, until more time goes by where we see varying levels of option and futures activity, the utility of that exercise is probably limited.
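Should such data accumulate, the fit itself would be straightforward. A hedged sketch using ordinary least squares (all inputs are hypothetical numpy arrays of matching length, not data from this study):

import numpy as np

def multifactor_rapad(tbills, option_volume, futures_positions, cash):
    # Residual cash level after regressing on rates plus option and
    # futures activity (ordinary least squares with an intercept).
    X = np.column_stack([np.ones(len(tbills)), tbills,
                         option_volume, futures_positions])
    coef, *_ = np.linalg.lstsq(X, cash, rcond=None)
    return cash - X @ coef        # a multi-factor premium/discount measure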
Another likely factor in cash balances is the impact of rising or falling market prices themselves, regardless of fund manager sentiment. When market prices
rise, we should see a decline in the percentage of assets held in cash, simply
because the total portfolio is worth more than it was before.
To check whether this may be the case or not, we looked at the month-to-month percentage change in the S&P 500 and compared it to the month-to-month change in mutual fund cash balances. The correlation was -0.38, which
means that it could be statistically possible for one factor to account for around
14% of the movement in the other. Since the correlation is negative, it helps to
confirm the theory that cash balances would fall when prices rise and vice versa.
The same negative correlation holds (although it falls to -0.20) when we look at
this month's change in the S&P 500 and next month's change in cash balances.
This is another significant factor and should be included in any discussion about
whether cash levels have moved an inordinate amount in any given period.
However, it is important to distinguish those times when funds are holding
low levels of cash because they are overly optimistic versus those times when
they are holding low cash reserves because there are few other alternatives.
With 90-Day T-Bill rates yielding barely above 1% at the time, June 2004 was
certainly one of the latter.
This does not mean the stock market cannot - or should not - decline, it
simply means that overzealous fund managers are not necessarily a catalyst. If
we see rates rise significantly in 2004, but cash reserves at mutual funds hold
steady or decline, then there may be some real evidence that fund managers are
excessively optimistic.
As we saw from Figure 3, overly optimistic portfolio managers are a good
sign that whatever rally is in place may be about to lose steam.

Recent Activity
As of June 2004, mutual funds held 4.3% of their total assets in liquid assets. Given the low level of short-term interest rates at the time, it is not entirely
unexpected that cash reserves would be so low. Still, anytime the absolute level
of cash is low, we believe investors should be worried. While fund companies
have better reporting systems now than they did 20 years ago, there is still the
possibility of a cash crunch, whereby funds without adequate cash on hand are forced into heavy selling to meet unexpected redemptions.
This of course would exacerbate the market decline that is likely the reason
for the redemptions in the first place.


Sources
Investment Company Institute (http://www.icinet.net/)
The Federal Reserve Bank of St. Louis (http://www.stlouisfed.org/)

Appendix A
The table below outlines each month where the RAPAD reading was considered extreme. The table gives the month of the occurrence, the S&P 500
cash index level at the time, the RAPAD reading for that month, and the return
in the S&P 500 cash index 6, 12, 18 and 24 months later.

Date        S&P 500   RAPAD              S&P 500 Return
                               6 Mo.    12 Mo.   18 Mo.   24 Mo.
                               Later    Later    Later    Later

All Occurrences with RAPAD Readings of -2.25 and Below (Extreme Cash Deficit)
1/30/81      129.55   -3.55     1.1%    -7.1%   -17.3%    12.2%
3/31/00     1498.58   -3.39    -4.1%   -22.6%   -30.5%   -23.4%
4/30/81      132.81   -3.22    -8.2%   -12.3%     0.7%    23.8%
7/31/81      130.92   -3.06    -8.0%   -18.2%    11.0%    24.2%
5/29/81      132.59   -2.99    -4.7%   -15.6%     4.5%    22.5%
2/27/81      131.27   -2.98    -6.5%   -13.8%    -9.0%    12.8%
2/29/00     1366.42   -2.94    11.1%    -9.3%   -17.0%   -19.0%
8/31/81      122.79   -2.94    -7.9%    -2.7%    20.6%    33.9%
1/31/00     1394.46   -2.90     2.6%    -2.0%   -13.1%   -19.0%
8/31/00     1517.68   -2.90   -18.3%   -25.3%   -27.1%   -39.6%
1/31/73      116.03   -2.84    -6.7%   -16.8%   -31.6%   -33.7%
6/30/81      131.21   -2.82    -6.6%   -16.5%     7.2%    27.8%
12/31/99    1469.25   -2.82    -1.0%   -10.1%   -16.7%   -21.9%
9/30/71       98.34   -2.81     9.0%    12.4%    13.4%    10.3%
4/30/98     1111.75   -2.77    -1.2%    20.1%    22.6%    30.6%
12/29/72     118.05   -2.77   -11.7%   -17.4%   -27.1%   -41.9%
11/28/80     140.52   -2.69    -5.6%   -10.1%   -20.4%    -1.4%
6/30/71       98.70   -2.62     3.4%     8.6%    19.6%     5.6%
5/29/98     1090.82   -2.58     6.7%    19.3%    27.3%    30.2%
5/31/71       99.63   -2.56    -5.7%     9.9%    17.1%     5.3%
12/31/80     135.76   -2.55    -3.4%    -9.7%   -19.3%     3.6%
7/31/00     1430.83   -2.55    -4.5%   -15.3%   -21.0%   -36.3%
3/31/98     1101.75   -2.55    -7.7%    16.8%    16.4%    36.0%
11/30/99    1388.91   -2.51     2.3%    -5.3%    -9.6%   -18.0%
6/30/00     1454.60   -2.49    -9.2%   -15.8%   -21.1%   -32.0%
5/31/72      109.53   -2.48     6.5%    -4.2%   -12.4%   -20.3%
4/28/00     1452.43   -2.46    -1.6%   -14.0%   -27.0%   -25.9%
7/30/71       95.58   -2.44     8.7%    12.4%    21.4%    13.2%
6/30/98     1133.84   -2.42     8.4%    21.1%    29.6%    28.3%
9/30/76      105.24   -2.42    -6.5%    -8.3%   -15.2%    -2.6%
7/30/76      103.44   -2.38    -1.4%    -4.4%   -13.7%    -2.7%
10/29/76     102.90   -2.33    -4.3%   -10.3%    -5.9%    -9.5%
6/30/76      104.28   -2.33     3.0%    -3.6%    -8.8%    -8.4%
3/31/81      136.00   -2.28   -14.6%   -17.7%   -11.5%    12.5%
8/31/76      102.91   -2.26    -3.0%    -6.0%   -15.4%     0.4%
9/29/00     1436.51   -2.26   -19.2%   -27.5%   -20.1%   -43.2%

Average Return                 -3.0%    -6.1%    -5.5%    -1.8%
Number of Occurrences             36       36       36       36
Number of Positive Occurrences    11        8       13       18
Positive Occurrences as % of Total   31%     22%     36%     50%
Maximum Return                 11.1%    21.1%    29.6%    36.0%
Minimum Return                -19.2%   -27.5%   -31.6%   -43.2%


All Occurrences with RAPAD Readings of +2.25 and Above (Extreme Cash Surplus)

Date        S&P 500   RAPAD    6 Mo.    12 Mo.   18 Mo.   24 Mo.
                               Later    Later    Later    Later
10/29/93     467.83    2.25    -3.6%     1.0%    10.0%    24.3%
11/29/74      69.97    2.26    30.3%    30.4%    43.2%    45.9%
11/28/86     249.22    2.27    16.4%    -7.6%     5.2%     9.8%
1/31/94      481.61    2.28    -4.8%    -2.3%    16.7%    32.1%
5/30/80      111.24    2.29    26.3%    19.2%    13.6%     0.6%
6/30/88      273.50    2.29     1.5%    16.3%    29.2%    30.9%
6/30/58       45.24    2.31    22.0%    29.2%    32.4%    25.8%
4/30/86      235.52    2.32     3.6%    22.4%     6.9%    11.0%
6/30/93      450.53    2.35     3.5%    -1.4%     1.9%    20.9%
7/31/86      236.12    2.37    16.1%    35.0%     8.9%    15.2%
6/29/90      358.02    2.40    -7.8%     3.7%    16.5%    14.0%
7/31/70       78.05    2.41    22.8%    22.5%    33.2%    37.6%
8/31/88      261.52    2.42    10.5%    34.4%    26.9%    23.3%
7/31/90      356.15    2.43    -3.4%     8.9%    14.8%    19.1%
5/31/93      450.19    2.44     2.6%     1.4%     0.8%    18.5%
6/30/92      408.14    2.48     6.8%    10.4%    14.3%     8.9%
8/29/86      252.93    2.48    12.4%    30.4%     5.9%     3.4%
10/30/92     418.68    2.48     5.1%    11.7%     7.7%    12.8%
11/30/92     431.35    2.53     4.4%     7.1%     5.8%     5.2%
9/30/88      271.91    2.54     8.4%    28.4%    25.0%    12.6%
10/31/66      80.20    2.55    17.2%    16.3%    21.5%    28.9%
7/30/93      448.13    2.55     7.5%     2.3%     5.0%    25.4%
2/29/88      267.82    2.56    -2.4%     7.9%    31.2%    23.9%
10/31/86     243.98    2.57    18.2%     3.2%     7.1%    14.3%
10/31/85     189.82    2.57    24.1%    28.5%    51.9%    32.6%
2/28/94      467.14    2.58     1.8%     4.3%    20.3%    37.1%
7/29/88      272.02    2.59     9.4%    27.2%    21.0%    30.9%
9/30/92      417.80    2.61     8.1%     9.8%     6.7%    10.7%
7/31/92      424.21    2.67     3.4%     5.6%    13.5%     8.0%
5/31/88      262.16    2.75     4.4%    22.3%    32.0%    37.8%
11/30/89     345.99    2.78     4.4%    -6.9%    12.7%     8.4%
2/26/93      443.38    2.79     4.6%     5.4%     7.2%     9.9%
1/29/88      257.07    2.85     5.8%    15.7%    34.6%    28.0%
5/31/90      361.23    2.90   -10.8%     7.9%     3.9%    15.0%
3/31/88      258.89    2.91     5.0%    13.9%    34.9%    31.3%
9/30/86      231.32    2.97    26.1%    39.1%    11.9%    17.5%
8/31/92      414.03    2.98     7.1%    12.0%    12.8%    14.8%
4/30/87      288.36    3.00   -12.7%    -9.4%    -3.3%     7.4%
1/31/90      329.08    3.10     8.2%     4.5%    17.8%    24.2%
3/31/93      451.67    3.11     1.6%    -1.3%     2.4%    10.9%
2/28/90      331.89    3.19    -2.8%    10.6%    19.1%    24.3%
10/30/87     251.79    3.23     3.8%    10.8%    23.0%    35.2%
4/29/88      261.33    3.28     6.8%    18.5%    30.2%    26.6%
3/30/90      339.94    3.47   -10.0%    10.4%    14.1%    18.8%
4/30/93      440.19    3.51     6.3%     2.4%     7.3%    16.9%
8/31/90      322.56    3.67    13.8%    22.6%    27.9%    28.4%
12/31/90     330.22    3.75    12.4%    26.3%    23.6%    31.9%
9/30/74       63.54    3.99    31.2%    32.0%    61.7%    65.6%
11/30/87     230.30    4.06    13.8%    18.8%    39.2%    50.2%
4/30/90      330.80    4.08    -8.1%    13.5%    18.6%    25.4%
11/30/90     322.22    4.36    21.0%    16.4%    28.9%    33.9%
9/28/90      306.05    4.60    22.6%    26.7%    31.9%    36.5%
10/31/90     304.00    4.81    23.5%    29.1%    36.5%    37.7%

Average Return                  8.3%    14.1%    19.4%    23.0%
Number of Occurrences             53       53       53       53
Number of Positive Occurrences    43       47       52       53
Positive Occurrences as % of Total   81%     89%     98%    100%
Maximum Return                 31.2%    39.1%    61.7%    65.6%
Minimum Return                -12.7%    -9.4%    -3.3%     0.6%

The figures below give the average of all data during the study period:

Average Return                  4.1%     8.1%    12.2%    16.6%
Number of Occurrences            585      579      573      567
Number of Positive Occurrences   386      403      419      448
Positive Occurrences as % of Total   66%     70%     73%     79%

Appendix B
Trade signals using the extremes in the RAPAD mutual fund cash level
as entries and exits.

Biography
Jason Goepfert is the President and CEO of Sundial Capital Research,
Inc., a firm focused on the research and practical application of mass
psychology to the financial markets. Prior to founding Sundial, Jason
managed the operations of a large discount brokerage firm and a multi-billion-dollar hedge fund, experience which firmly planted the idea that logic rarely trumped emotion when it came to traders' investment decisions.
Sundial trades proprietary capital and releases its research to institutional
clients and individual investors via its web site, www.sentimenTrader.com


Introducing the Volume Price Confirmation Indicator (VPCI):
Price & Volume Reconciled

Buff Dormeier, CMT
This paper introduces a new volume-price measurement tool that could provide the clearest picture of the volume-price relationship of any indicator devised: the Volume-Price Confirmation Indicator, or VPCI. The VPCI reveals
the intrinsic relationships between price and volume as a validation or contradiction of price trends. In other words, VPCI identifies the inherent relationship between price and volume as harmonious or inharmonious. This study
shows that investors who use the VPCI properly may increase their profits and
the reliability of their trades, while simultaneously reducing risk.
In the exchange markets, price results from an agreement between buyers
and sellers to exchange, despite their different appraisals of the exchanged item's value. One opinion may be heavily loaded in meaning and purpose; the other may be pure nonsense. However, both are equal as far as the market is concerned. Price represents the convictions, emotions and volition of investors. It
is not a constant, but rather is changed and influenced by information, opinions
and emotions over time. Market volume represents the number of shares traded
over a given period. It is a measurement of participation, enthusiasm or interest in a given security. Price and volume are closely linked, yet are independent
variables. Together, these individually derived variables give better indications of supply and demand than either can provide independently.
Volume can be thought of as the force that drives the market. Force or volume is defined as power made operative against support or resistance.i In physics, force is a vector quantity that tends to produce an acceleration.ii The same
is true of market volume. Volume substantiates and mediates price. When volume increases, it confirms price; when volume decreases, it contradicts price
movements. In theory, increases in volume should precede significant price
movements, giving quicker downside and upside signals. This basic tenet of
technical analysis has been repeated as a mantra since the days of Charles Dow.iii
When stocks change hands, there is always an equal amount of buy volume
to sell volume on executed orders. When the price moves up, it reflects reasoned demand or the fact that buyers are in control. Likewise, when the price
moves down, it implies supply or that sellers are in control. Over time, these
trends of supply and demand form accumulation and distribution patterns. VPCI
was designed to expose price and volume relationships as validation or contradiction of price trends. The following pages discuss the derivation and components of VPCI, explain how to use VPCI, review comprehensive testing of VPCI
and present further applications.

Deriving the Components


The market is likened to an orchestra without a conductor. By mediating
the intrinsic relationship between price and volume, the VPCI attunes price and
volume into an observable accord. Simply put, this could be considered the
harmony between price and volume. The basic concept is that measuring the
difference between volume-weighted moving averages (VWMAs) and the corresponding simple moving average (SMA) reveals a precise level of price-volume confirmation or price-volume contradiction. This occurs because volume-weighted averages weight closing prices in exact proportion to the volume traded during each time period.
Since VWMAs are essential to understanding the VPCI, it is important to
differentiate them from SMAs. The VWMA was developed to give a more
accurate account of trends by modifying the SMA. The VWMA measures the
commitment expressed through a closing price, weighted by that day's corresponding volume (participation), compared to the total volume (participation) of the trading range. Although SMAs exhibit a stock's changing price levels, they do not reflect the amount of participation by investors. However, with VWMAs, price emphasis is directly proportional to each day's volume, compared to the average volume in the range of study.


The VWMA is calculated by weighting each timeframe's closing price with the timeframe's volume compared to the total volume during the range:

VWMA = sum { closing price(i) * [ volume(i) / total volume of the range ] }

where i = a given day's action.
Here is an example of how to calculate a two-day moving average, using both the SMA and the VWMA, on a security trading at $10.00 with 100,000 shares on the first day and at $12.00 with 300,000 shares on the second day. The SMA calculation is Day One's price plus Day Two's price divided by the number of days, or (10+12)/2, which equals 11. The VWMA calculation would be Day One's price (10) multiplied by Day One's volume of the total range expressed as a fraction (100,000/400,000 = 1/4) plus Day Two's price (12) multiplied by Day Two's volume of the total range expressed as a fraction (300,000/400,000 = 3/4), which equals 11.5.

Abbreviations
SMA        simple moving average
VWMA       volume-weighted moving average
VPC (+/-)  volume-price confirmation/contradiction
VPR        volume-price ratio
VM         volume multiplier
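The same two-day example can be verified in a few lines of Python (a sketch for illustration, not code from the original study):

def sma(closes):
    return sum(closes) / len(closes)

def vwma(closes, volumes):
    # Each close is weighted by its share of the total volume in the range.
    total_volume = sum(volumes)
    return sum(c * v / total_volume for c, v in zip(closes, volumes))

closes, volumes = [10.0, 12.0], [100_000, 300_000]
print(sma(closes))            # 11.0
print(vwma(closes, volumes))  # 11.5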
Keeping in mind how VWMAs work, an investigation of VPCI may begin.
The VPCI involves three simple calculations:
1.) volume-price confirmation/contradiction (VPC+/-),
2.) volume-price ratio (VPR), and
3.) volume multiplier (VM).
The first step in calculating VPCI is to choose a long-term and short-term
timeframe. The long-term timeframe number will be used in computing the
VPC as the simple and volume-weighted price-moving average, and again in
calculating the VM as a simple, volume-moving average. The short-term
timeframe number will be used in computing the VPR as a simple and volume-weighted price-moving average and again in calculating the VM as a simple,
volume-moving average.
The VPC is calculated by subtracting a long-term SMA from the same
timeframe's VWMA. In essence, this calculation is the otherwise unseen nexus
between volume proportionally weighted to price and price proportionally
weighted to volume. This difference, when positive, is the VPC+ (volume-price confirmation) and, when negative, the VPC- (volume-price contradiction). In effect, this computation reveals price and volume symmetrically distributed over time. The result is quite revealing. For example, a 50-day SMA
is 48.5, whereas the 50-day VWMA is 50. The difference of 1.5 represents
price-volume confirmation. If the calculation were negative, then it would represent price-volume contradiction. This alone provides purely unadorned information about the intrinsic relationship between price and volume.
The next step is to calculate the volume price ratio. VPR accentuates the
VPC+/- relative to the short-term price-volume relationship. The VPR is calculated by dividing the short-term VWMA by the short-term SMA. For example, assume the short-term timeframe is 10 days, and the 10-day VWMA is
25, while the 10-day SMA is 20. The VPR would equal 25/20, or 1.25. This
factor will be multiplied by the VPC (+/-) calculated in the first step. Volume-price ratios greater than 1 increase the weight of the VPC+/-. Volume-price
ratios below 1 decrease the weight of the VPC+/-.
The third and final step is to calculate the volume multiplier. The VM's objective is to overweight the VPCI when volume is increasing and underweight
the VPCI when volume is decreasing. This is done by dividing the short-term
volume average by the long-term volume average. As an illustration, assume the short-term average volume for 10 days is 1.5 million shares a day, and the
long-term volume average for 50 days is 750,000 shares per day. The VM is 2
(1,500,000/750,000). This calculation is then multiplied by the VPC+/- after it
has been multiplied by the VPR.
Now we have all the information necessary to calculate the VPCI. The
VPC+ confirmation of +1.5 is multiplied by the VPR of 1.25, giving 1.875.
Then 1.875 is multiplied by the VM of 2, giving a VPCI of 3.75. Although this
number is indicative of an issue under very strong volume-price confirmation,
this information serves best relative to the current and prior price trend and
relative to recent VPCI levels. Discussed next is how best to use the VPCI.
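Putting the three steps together, a compact pandas-based sketch of the full calculation might look as follows (the rolling implementation and default windows are assumptions for illustration, not the author's original code):

import pandas as pd

def rolling_vwma(close: pd.Series, volume: pd.Series, window: int) -> pd.Series:
    return (close * volume).rolling(window).sum() / volume.rolling(window).sum()

def vpci(close: pd.Series, volume: pd.Series,
         long_window: int = 50, short_window: int = 10) -> pd.Series:
    # Step 1: VPC = long-term VWMA minus long-term SMA.
    vpc = rolling_vwma(close, volume, long_window) - close.rolling(long_window).mean()
    # Step 2: VPR = short-term VWMA divided by short-term SMA.
    vpr = rolling_vwma(close, volume, short_window) / close.rolling(short_window).mean()
    # Step 3: VM = short-term volume average over long-term volume average.
    vm = volume.rolling(short_window).mean() / volume.rolling(long_window).mean()
    return vpc * vpr * vm   # e.g. 1.5 * 1.25 * 2 = 3.75, as above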

Figure 1. Bullish Confirmation: SIRI's Rising Price Trend and Rising VPCI. Bottom Red, Wiggly Line is VPCI; Smoother Blue Line is VPCI Smoothed.

Using VPCI
Confirming Signals

Unlike other volume-price indicators, the VPCI is not a stand-alone tool. Most volume-price indicators may give signals without regard to price trend
(although this is not advised). For example, a trader may buy an issue based on
a breakout of On Balance Volume, or sell an issue on a Money Flow Index
divergence in an overbought zone. However, the VPCI gives virtually no indications outside of its relationship to price; it only confirms or contradicts the
price trend. There are several ways to use VPCI in conjunction with price trends
and price indicators. These include a VPCI greater than zero, a rising or falling
VPCI, a smoothed (moving average) rising or falling VPCI, or a VPCI as a
multiplier. Table 1 gives the basic VPCI utilizations:
Table 1. VPCI and Price Trends

Price       VPCI        Price-Trend Relationship   Implications
Rising      Rising      Confirmation               Bullish
Rising      Declining   Contradiction              Bearish
Declining   Rising      Confirmation               Bearish
Declining   Declining   Contradiction              Bullish

VPCI in Action

In our first example (Figure 1), the price trend of SIRI is rising and the
VPCI is also rising. Here the VPCI is giving three bullish signals, the most
important being that the VPCI is rising. Increasing volume-price confirmation demonstrates strengthening commitment to the existing uptrend of demand. Secondly, the VPCI smoothed is rising and the VPCI has crossed above it,
indicating momentum within the confirmation. This is a good indication that
the existing bullish price trend will continue. Last and least important, both the
VPCI and VPCI smoothed are above the zero line, indicating a healthy longer-term accumulation. All of these VPCI indications are interpreted as bullish only because SIRI's prevailing trend is rising.

Next, we look at an example of the VPCI giving a bearish contradiction signal (Figure 2). TASR's stock price is rising, but the VPCI is falling. This
situation suggests caution - a significant price correction could be looming because the intrinsic relationship between price and volume is not harmonious.
Although price is rising and volume appears supportive, the VPCI is indicating
that demand is no longer in control. Here two bearish signs are given in the
presence of a rising stock price. Most significantly, both the VPCI and VPCI
smoothed are in downtrends, indicating weakening commitment to the uptrend.
Also, both the VPCI and VPCI smoothed are below zero, suggesting an unhealthy uptrend.
Figure 2. Bearish Contradiction: TASR's Rising Price Trend and Falling VPCI

A falling stock price and a rising VPCI (Figure 3) is an example of volume-price confirmation. In our illustration, GSK's stock price is falling and the VPCI
is rising, indicating control is clearly in the hands of sellers. The VPCI moves
gradually upward, supporting the downward price movement. Gaining momentum, the VPCI crosses above zero and eventually through the VPCI smoothed.
GSK's stock price breaks down shortly afterwards on the selling pressure.


Figure 3. Bearish Confirmation: GSK's Falling Price Trend and Rising VPCI

RIMM provides an example of a bullish contradiction. In Figure 4, RIMM's price is declining, as is the VPCI. A decreasing VPCI while the price is falling
is usually a sign of increasing demand, especially if the stock has previously
been in an uptrend, as was the case with RIMM. When RIMM begins to break
down, the VPCI takes a sharp nosedive, indicating a weak selloff. Once the
VPCI bottoms, the bulls regain control of RIMM and the breakdown is reversed. The VPCI turns upward, confirming the prior uptrend. This is a classic
example of the VPCI indicating a countertrend.
Figure 4. Bullish Contradiction: RIMM's Falling Price and a Falling VPCI

Putting it all together, let us take a look at one final example of the VPCI in
action (Figure 5). It's extremely important to note when using VPCI that volume leads or precedes price action. Unlike most indicators, the VPCI will
often give indications before price trends are clear. Thus, when a VPCI signal
is given in an unclear price trend, it is best to wait until one is evident. This
final example is given in a weekly timeframe to illustrate VPCI signals in a
longer-term cycle.
At Point 1 in Figure 5, CMX is breaking out and the VPCI confirms this
breakout as it rapidly rises, crossing over the VPCI smoothed and zero. This is
an example of a VPCI bullish confirmation. Later, the VPCI begins to fall
during the uptrend, suggesting a pause within the new uptrend. This small
movement is a bearish contradiction. At Point 2, CMX's price falls as the
VPCI continues to fall below zero and eventually through the VPCI smoothed,
gaining momentum. This is a classic example of a countertrend VPCI bullish
contradiction. At Point 3, the VPCI has bottomed out and, along with CMX, begins to

rise, confirming the last VPCI signal. Later, at Point 3, the VPCI moves upward,
supporting the higher price movement. By Point 4, CMX breaks through resistance, while the VPCI's upward momentum accelerates rapidly, crossing the VPCI
smoothed and zero. From this bullish confirmation, one could deduce a high
probability of a price breakout, illustrating bullish confirmation once again.
Figure 5. VPCI in Action: CMX

Testing the VPCI

Applying the VPCI information to a trading system should improve profitability. To evaluate this hypothesis, the VPCI was tested by contrasting two moving-average trading systems. The goal of this study was not to
contrasting two moving average systems. The goal of this study was not to
achieve optimum profitability but to compare a system using VPCI signals to
that of a system not using them. The crossing of the five-day and 20-day moving averages was used to generate buy and sell signals. The five-day moving
average represents the cost basis of traders in a one-week timeframe. The 20-day moving average represents the cost basis of traders in a one-month
timeframe. The shorter moving average is more responsive to current price
action and trend changes, because it emphasizes more recent price changes.
The longer-term moving average comprises more information and is more indicative of the longer-term trend. Because its scope is broader, the longer-term
moving average normally lags behind the action of the shorter moving average.
When a moving average curls upward, the investors within this timeframe are
experiencing positive momentum. The opposite is true when the moving average curls downward. When the short-term moving average's momentum is significant enough to cross over the longer-term moving average, this is an indication of a rising trend, otherwise known as a buy signal. Likewise, when the shorter-term moving average's momentum crosses under the longer-term moving average, a sell signal is generated.
Back-tested first was a five- and 20-day crossover system. A long position
is taken when the short-term moving average crosses above the long-term moving average. A short position is taken when the short-term moving average
crosses under the long-term moving average. These actions tend to represent
short-term changes in momentum and trend. In the comparative study, I used
the same five- and 20-day crossover, but kept only the trades when the VPCI
had crossed over a smoothed VPCI. This indicates a rising VPCI or price confirmation. The VPCI settings will be the same as the moving averages, 20 days
for the long-term component and five days for the short-term component. The
VPCI smoothed is the 10-day average of the VPCI.
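A simplified sketch of this comparison framework appears below; it reuses the vpci helper sketched earlier and omits the lookback delays discussed in the next paragraph, so it approximates rather than reproduces the tested system:

import pandas as pd

def crossover_signals(close: pd.Series) -> pd.Series:
    # +1 while the 5-day average is above the 20-day average, -1 while below.
    fast = close.rolling(5).mean()
    slow = close.rolling(20).mean()
    return (fast > slow).astype(int) - (fast < slow).astype(int)

def vpci_filtered_signals(close: pd.Series, volume: pd.Series) -> pd.Series:
    # Keep crossover positions only while the VPCI sits above its
    # 10-day smoothing, i.e. while price confirmation is present.
    raw = crossover_signals(close)
    indicator = vpci(close, volume, long_window=20, short_window=5)
    confirmed = indicator > indicator.rolling(10).mean()
    return raw.where(confirmed, 0)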
There are a number of limitations to a study framed this way but these
settings were chosen deliberately to keep the study simple and uncompromised.
First, the five- and 20-day moving average settings are too short to indicate a
strong trend. This detracts from the effectiveness of the VPCI as an indicator
of price trend confirmation or contradiction. However, although these settings
are short, they provide more trades than a longer-term trend system, creating a
more significant sample size. Also, the VPCI settings at five and 20 days, when the price data is only 20 days old (the length of the long-term moving average), are too short. By using these time settings, the VPCI may give indications
ahead of the price trend or momentum signals given by the moving average.
However, changing the settings could be interpreted as being optimized. Accordingly, a 10-day lookback delay on the VPCI and a five-day lookback delay
on the VPCI smoothed was installed. This delay gives the VPCI confirmation
signal more synchronicity with the lagging moving average crossover. Ideally,
VPCI delays should be tuned to the individual issue. My testing has shown that more responsive high-volume and high-volatility issues generally do not require delays as long as slower-moving low-volume and low-volatility issues.
One could also use trend lines corresponding to the timeframe being applied to
tune the VPCI.
To ensure a broad scope within the sample being studied, the test was broken into several elements. Securities were selected across three areas of capitalization: small, as measured by the S&P Small Cap Index; medium, as measured by the S&P 400 Mid Cap Index; and large, as measured by the S&P 100
Large Cap Index. Equally important are the trading characteristics of each
security. Thus, securities were further characterized by volume and volatility.
Combining these seven traits forms a total of 12 groups: small cap high volume, small cap low volume, small cap high volatility, small cap low volatility,
mid cap high volume, mid cap low volume, mid cap high volatility, mid cap
low volatility, large cap high volume, large cap low volume, large cap high
volatility and large cap low volatility (Table 2).
Table 2. Sixty Securities Organized by Size, Volume and Volatility

Large Cap Low Volatility:   PG, SO, BUD, WFC, PEP
Large Cap High Volatility:  EP, AES, DAL, ATI, NSM
Large Cap High Volume:      CSCO, MSFT, INTC, ORCL, GE
Large Cap Low Volume:       ATI, HET, BDK, GD, CPB
Mid Cap Low Volatility:     MDU, ATG, WPS, HE, NFG
Mid Cap High Volatility:    ESI, LTXX, NDN, WIND, SEPR
Mid Cap High Volume:        ATML, SNDK, COMS, MLMN, CY
Mid Cap Low Volume:         WPO, BDG, TECUA, KELYA, CRS
Small Cap Low Volatility:   UNS, UBSI, CIMA, CTCO, ATO
Small Cap High Volatility:  BRKT, ZIXI, MZ, LENS, CRY
Small Cap High Volume:      CYBX, MOGN, KLIC, HLIT, YELL
Small Cap Low Volume:       NPK, GMP, SXI, SKY, LAWS

To ensure unbiased results, five securities were back-tested in each of these 12 subgroups, for a total of 60 securities, a significant sample size. For credibility, the five securities representing each group were not selected at random,
but by identifying the leaders in the various characteristics being measured.
Thus, the five highest- and lowest-volume securities, as well as the five highest- and lowest-volatility securities of each of the three capitalization groups as
identified by Bloomberg (June 22, 2004) were used in the study. Any duplicated securities (high-volume and high-beta stocks were occasional duplicates)
were used only once. Securities that lacked sufficient history were removed
and replaced by the next-best suitable issue.
To keep the system objective, both long- and short-system generated trades
were taken into account. A $10,000 position was taken with each crossover. Commissions were not included. The testing period used was August 15, 1996,
to June 22, 2004, for a total of 2,000 trading days. The results were measured
in terms of profitability, reliability and risk-adjusted return.
Profitability

Profitability was tested using a five- and 20-day moving average crossover
and then retested using only those trades also displaying VPCI confirmation
signals. The results were impressive (Figure 6). Broadly, the VPCI improved
profitability in the three size classes - small, mid, and large caps - and all four
style classifications - high and low volume, and high and low volatility. Nine of
the 12 subgroups showed improvement. The exceptions were mid cap high-volatility issues, and small and large cap low-volume issues. Of the 60 issues tested, 39, or 65%, showed improved results using VPCI. The VPCI group made
$381,089. This compares to the competing non-VPCI group making only
$169,092. Thus, overall profitability was boosted by $211,997 with VPCI.
Figure 6. Profitability Improvement with VPCI

Reliability

Reliability was measured by looking at the percentage of profitable trades.


By employing VPCI in the five-/20-day crossover system, overall reliability improved an average of 3.21% per issue. Improvement was realized by adding VPCI in all three size groups and all four style groups (Figure 7). Of the 12 subgroups, 10 showed improved reliability with the VPCI. Large and small
cap low-volatility issues were the two exceptions. Overall, over 71% (43 of 60
issues) showed improvement with the VPCI.
Figure 7. Average Trading System Reliability

Risk-Adjusted Returns

Two tests of risk-adjusted performance were conducted to further evaluate VPCI. One was the Sharpe Ratio, which takes the total return minus the risk-free rate of return (US Treasury note) and divides the result by the portfolio's monthly standard deviation, giving a risk-adjusted rate of return.
VPCI improved the results once again across all three size categories and all
four style groups. VPCI realized improvement in nine of the 12 subgroups. Mid
cap high volatility, large cap low volatility, and large cap low volume were the
exceptions. Overall, the Sharpe Ratio showed significant improvement with
the addition of the VPCI.
A second way to look at risk-adjusted returns is through profit factor (Figure 8). Profit factor takes into account how much money could be gained for
every dollar lost within the same strategy, measuring risk by comparing the
upside to the downside. It is calculated by dividing gross profits by gross losses.


For instance, one issue may generate $40,000 in losses and $50,000 in gains
whereas a second issue may generate $10,000 in losses and $20,000 in gains.
Both issues generate a $10,000 net profit. However, an investor could expect to
make $1.25 for every dollar lost in the first system, while expecting to make $2
for every dollar lost in the second system. The figures of $1.25 and $2 represent the profit factor. Even more significant improvements across all size, volume and volatility groups again were achieved using the VPCI. Of the 12 subgroups, only large cap low-volatility issues did not show an improvement with
the VPCI. Overall, the profit factor was improved by 19%, meaning one could
expect to earn 19% more profit for every dollar lost when applying VPCI to the
trading system.
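The calculation itself is simple enough to express in a few lines; this sketch reproduces the worked example above:

def profit_factor(trade_pnls):
    # Gross profits divided by gross losses.
    gains = sum(p for p in trade_pnls if p > 0)
    losses = -sum(p for p in trade_pnls if p < 0)
    return gains / losses if losses else float("inf")

print(profit_factor([50_000, -40_000]))  # 1.25, the first issue above
print(profit_factor([20_000, -10_000]))  # 2.0, the second issue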
Figure 8. Profit Factor Improvement Using VPCI Among the 12 Subgroups

Conclusion
The VPCI reconciles volume and price as determined by each of their proportional weights. This information can be used to confirm or deny the likelihood of a current price trend continuing. This study clearly demonstrates that
adding the VPCI indicator to a trend-following system results in consistently
improved performance across all major areas measured by the study. As a
maestro's baton in the hands of a proficient investor, the Volume Price Confirmation Indicator is a tool capable of substantially accelerating profits, reducing risk and empowering the investor to make more reliable investment decisions.

Footnotes

i   Ammer, C. (1997). The American Heritage Dictionary of Idioms. Boston: Houghton Mifflin Company.
ii  The American Heritage Stedman's Medical Dictionary. (2002). Boston: Houghton Mifflin Company.
iii Edwards, R.D., & Magee, J. (1992). Technical Analysis of Stock Trends. Boston: John Magee Inc.

Other Applications
The raw VPCI calculation may be used as a multiplier or divider in conjunction with other indicators such as moving averages, momentum indicators,
or price and volume data. For example, if an investor has a trailing stop loss
order set at the five-week moving average of the lows, one could divide the
stop price by the VPCI calculation. This would lower the price stop when price
and volume are in confirmation, which would increase the probability of keeping an issue demonstrating price-volume confirmation. However, when price
and volume are in contradiction, dividing the stop loss by the VPCI would raise
the stop price, preserving capital. Similarly, using VPCI as an add-on to various other price, volume, and momentum indicators may not only improve reliability but increase responsiveness as well.
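A hedged sketch of the stop-adjustment idea follows (function and series names are hypothetical; dividing by the raw VPCI is only well behaved for positive readings, so this illustrates the multiplier concept rather than a tested rule):

import pandas as pd

def vpci_adjusted_stop(weekly_lows: pd.Series, vpci_values: pd.Series) -> pd.Series:
    # Five-week average of the lows, divided by the raw VPCI value:
    # confirmation (VPCI above 1) lowers the stop, keeping the position;
    # weaker readings raise it, preserving capital.
    base_stop = weekly_lows.rolling(5).mean()
    return base_stop / vpci_values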

Biography

Buff Dormeier, CMT, began in the securities industry in 1993 with
PaineWebber. From PaineWebber, Buff joined Charles Schwab, where he handled some of the firm's largest and most active accounts. Due to his growing popularity with the firm's clientele, Buff was called to train other brokers in the art of communicating technical market analytics to customers. His training program gave birth to Schwab's Technical Analysis Team. Later,
Buff became the lead portfolio manager and chief technical analyst at T.P.
Donovan Investments. Armed with proprietary indicators and investment
programs, Buff now coaches and manages portfolios for individual and
institutional clients as a financial advisor at a major international brokerage
firm.
Buff's work with market indicators and trading system design has been both published and referenced in Stocks & Commodities and Active Trader magazines and has been discussed in seminars across the nation. Further, Buff's original contributions to the field have been included in John Bollinger's book, Bollinger on Bollinger Bands. Buff is Series 8, 7, 65,
63 and insurance licensed and has previously served on the Market
Technicians Association Admissions Committee. Personal hobbies include
running and Bible study.
