Journal of Technical Analysis (JOTA), Issue 62 (Summer-Fall 2004)

JOURNAL of Technical Analysis
Market Technicians Association, Inc. ■ A Not-For-Profit Professional Organization ■ Incorporated 1973
Summer-Fall 2004 ■ Issue 62




Table of Contents

Journal Editor & Reviewers

The Organization of the Market Technicians Association, Inc.

Behavioral Finance and Technical Analysis: Interpreting Data From an Experiment on Irrational Exuberance, Part B: Reflections from Three Different Angles
Henry O. Pruden, Ph.D.; Dr. Bernard Paranque; Dr. Walter Baets

2004 Charles H. Dow Award Winner: Mutual Fund Cash Reserves, the Risk-Free Rate and Stock Market Performance
Jason Goepfert

Introducing the Volume Price Confirmation Indicator (VPCI): Price & Volume Reconciled
Buff Dormeier, CMT

This issue of the Journal of Technical Analysis includes the most recent Charles H. Dow Award winner, Jason Goepfert. His study is an update on work that has been done before by others but is extremely detailed and comprehensive, and adds to our knowledge of how mutual fund cash positions reflect investment management opinion. Our old friend, and prior editor of the Journal, Henry Pruden, with two French professors, Bernard Paranque and Walter Baets, continues their discussion of a possible explanation for the connection between behavioral finance and stock market behavior. As always, they introduce many new ideas to think about. And finally, we have an article by Buff Dormeier in which he devises a new way to look at volume and price action together that appears to have some predictive ability.

Charles D. Kirkpatrick II, CMT, Editor



Journal Editor & Reviewers

Editor
Charles D. Kirkpatrick II, CMT
Kirkpatrick & Company, Inc.
Bayfield, Colorado

Associate Editor
Michael Carr, CMT
Cheyenne, Wyoming

Manuscript Reviewers
Connie Brown, CMT, Aerodynamic Investments Inc., Pawley's Island, South Carolina
Julie Dahlquist, Ph.D., University of Texas, San Antonio, Texas
J. Ronald Davis, CMT, Golum Investors, Inc., Portland, Oregon
Cynthia Kase, CMT, Kase and Company, Albuquerque, New Mexico
Michael J. Moody, CMT, Dorsey, Wright & Associates, Pasadena, California
Kenneth G. Tower, CMT, CyberTrader, Inc., Princeton, New Jersey
Avner Wolf, Ph.D., Bernard M. Baruch College of the City University of New York, New York, New York

Production Coordinator
Barbara I. Gomperts, Manager, Marketing Services, MTA
Marblehead, Massachusetts

Publisher
Market Technicians Association, Inc.
74 Main Street, 3rd Floor
Woodbridge, New Jersey 07095

JOURNAL of Technical Analysis is published by the Market Technicians Association, Inc. (MTA), 74 Main Street, 3rd Floor, Woodbridge, NJ 07095. Its purpose is to promote the investigation and analysis of the price and volume activities of the world's financial markets. JOURNAL of Technical Analysis is distributed to individuals (both academic and practitioner) and libraries in the United States, Canada and several other countries in Europe and Asia. JOURNAL of Technical Analysis is copyrighted by the Market Technicians Association and registered with the Library of Congress. All rights are reserved.


Member and Affiliate Information

MTA MEMBER

Member category is available to those "whose professional efforts are spent practicing financial technical analysis that is either made available to the investing public or becomes a primary input into an active portfolio management process or for whom technical analysis is a primary basis of their investment decision-making process." Applicants for Membership must be engaged in the above capacity for five years and must be sponsored by three MTA Members familiar with the applicant's work.

MTA AFFILIATE

MTA Affiliate status is available to individuals who are interested in technical analysis and the benefits of the MTA listed below. Most importantly, Affiliates are included in the vast network of MTA Members and Affiliates across the nation and the world, providing you with common ground among fellow technicians.

DUES

Dues for Members and Affiliates are $300 per year and are payable when joining the MTA and annually on July 1st. College students may join at a reduced rate of $50 with the endorsement of a professor. Applicants for Member status will be charged a one-time application fee of $25.

Members and Affiliates

■ have access to the Placement Committee (career placement)
■ can register for the CMT Program
■ may attend regional and national meetings with featured speakers
■ receive a reduced rate for the annual seminar
■ receive the monthly newsletter, Technically Speaking
■ receive the Journal of Technical Analysis, bi-annually
■ have access to the MTA website and their own personal page
■ have access to the MTA lending library
■ become a Colleague of the International Federation of Technical Analysts (IFTA)

Journal Submission Guidelines

We want your article to be published and to be read. In the latter regard, we ask for active simple rather than passive sentences, minimal syllables per word, and brevity. Charts and graphs must be cited in the text, clearly marked, and limited in number. All equations should be explained in simple English, and introductions and summaries should be concise and informative.
1. Authors should submit, with a cover letter, their manuscript and supporting material on a 1.44MB diskette or through email. The cover letter should include the authors' names, addresses, telephone numbers, email addresses, the article title, format of the manuscript and charts, and a brief description of the files submitted. We prefer Word for documents and *.jpg for charts, graphs or illustrations.
2. As well as the manuscript, references, endnotes, tables, charts, figures, or illustrations, each in separate files on the diskette, we request that the authors submit a non-technical abstract of the paper as well as a short biography of each author, including educational background and special designations such as Ph.D., CFA or CMT.
3. References should be limited to works cited in the text and should follow the format standard to the Journal of Finance.
4. Upon acceptance of the article, to conform to the above style conventions, we maintain the right to make revisions or to return the manuscript to the author for revisions.

Please submit your non-CMT paper to:
Charles D. Kirkpatrick II, CMT
7669 CR 502
Bayfield, CO
[email protected]

The Organization of the Market Technicians Association, Inc.


In Part A of "Interpreting the Findings of An Experiment on Irrational Exuberance..." the authors organized their analyses around a "positive theory" of behavioral finance and the "nominal theory" of technical market analysis rules. The behavioral finance model for structuring the data of the experiment was the Cusp Catastrophe Model of non-linear behavior. The nominal model, based upon the data exposed by the positive model, was a group of four technical market analysis principles and one mental discipline/trading strategy discipline.
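For readers unfamiliar with it, the Cusp Catastrophe Model mentioned above has a standard canonical form. The equations below sketch that conventional form from catastrophe theory; they are not reproduced from Part A, and mapping the state variable and the two control factors onto the experiment's variables is an assumption for illustration.

```latex
% Canonical cusp potential: state x, control factors a (splitting) and b (normal)
V(x) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}a x^{2} + b x,
\qquad
\frac{\partial V}{\partial x} = x^{3} + a x + b = 0 .
```

The equilibrium surface x^3 + ax + b = 0 folds where its discriminant vanishes, at 4a^3 + 27b^2 = 0; as the control factors cross that fold, the prevailing stable equilibrium disappears and the state jumps discontinuously, which is the kind of "catastrophe jump point" the price behavior in the experiment is said to approach.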

This article, the Part B article of the series, seeks to extend the interpretations of the findings. But rather than the single tightly structured theme of Part A, this time Part B calls upon the varied talents of all three co-authors. This article takes advantage of the international, cross-cultural and multiple disciplines represented by the three authors. Hence, each of the three co-authors was asked to analyze and re-interpret the experimental evidence from his particular professional discipline. Thus, the three co-authors reflect interpretations from three different angles.

The first subsection of this article by Walter Baets reflects his area of discipline, which is complexity and knowledge management. Dr. Baets reflected upon the behavior that gave rise to the price behavior generated by the experiment. Dr. Baets provides a broad perspective upon behavioral finance notions and he gives a penetrating look into the structure behind the actions of the student traders in the Cal Tech Experiment. Dr. Baets considers SWARM-like theories to explain the emergent behavior generated by the individualistic, self-serving behavior of interacting agents, the students in the experiment.

The second sub-section by Professor Bernard Paranque reflects his formation as a doctor of economics and his responsibilities as Head of the Finance and Information Department. Dr. Paranque sets forth reflections upon the experiment described in Part A that touch upon the sensitive and vital but often unexamined issue of risk and welfare for all market participants. This viewpoint stands opposite to the self-serving behavior of the few elite traders who could have exploited the cusp and profited from the decline using technical analysis tools. In other words, Dr. Paranque takes up the challenge of examining the ethical dimension of behavioral finance and technical market analysis. His point of departure is the "greater fool" theory operating during the experiment.

Pruden takes a pragmatic yet artistic approach to the extraction of more information from the technical analysis rules that were used to interpret the laboratory data, and that could have been used by the astute, elite trader to exit the market in advance of the crash in prices.

The third section by Pruden reflects upon the four technical rules or principles that were applied in the Part A article. Pruden in Part B seeks to extend the analytical capacity of each of the four technical rules or principles by carrying them into the realm of Sequential Art. These were the tools that could have been employed by the astute, elite trader to identify the cusp in time to avoid the catastrophic crash. His goal is to offer ideas and techniques for extracting even more information from the data found in the laboratory experiment.

Extensions stimulated by notions from Sequential Art1 may offer value added to technical analysts in general.

Sub-Part One by Dr. Walter Baets

Agent Behavior and SWARM-Like Theory

"There is no path; you lay down the path in walking." Machado, a Spanish poet

In the Part A article, we observed a clustered, rather linear and persistent behaviour of actors/agents. In fact, an interesting observation is that the model visualizes the emergence of a certain kind of local stabilities (probably comparable to what are known as attractors in complexity theory), before something like a (belief?) shift takes place, moving the agents and the system into what could be called discontinuous behavior. The model indeed visualizes emergence of interacting agents, yet it does not allow us to gain insight into the mechanism of the construction of the phenomenon it describes. If we wish to take this argument further and get a deeper understanding of both the market behaviour and specifically the role of the interacting agents, we should go deeper into theories that are emergent in nature and simulate agent-based behavior. Commonly known are SWARM-like theories.

Behavioural Finance and Technical Analysis point out the coordination problem of the agents' actions. This is commonly accepted, but the coordination problem of the agents' behavior has been studied under a certain (widely accepted) ontological and epistemological assumption of causality, i.e. that reality is based on a causal interaction between variables, independent of the emotional aspects of human agents. The causal approach rests on the assumption, held by a large part of cognitive psychologists, that the mind is a processor (a computer) of information that is caught outside the person. It denies that the observer creates his own reality while observing, and it denies the fact that market behaviour also includes the significant interaction of agents (and their respective behaviours). A consequence of this ontology is that only what can be measured could be managed, and more broadly, only what can be observed exists. It is this rational, reductionist view of human behaviour that we often find in technical analysis. Within this ontological and epistemological choice, causality makes sense, and (knowledge) engineering approaches should be able to give answers to issues of market behaviour. Knowledge engineering techniques have been extensively used in order to construct market analysis tools.

Keep in mind that we classically talk about emotions and psychology, but always and only within the above described ontology. That seriously limits our view and hence what we will eventually observe. In order to observe differently, we have to investigate first the ontology behind our thinking. An alternative ontology that increasingly gains attention is the one based on what neurobiologists (Maturana and Varela) call an enacted and embodied view of cognition. This ontology is based on the acceptance that the observer himself creates the reality which he observes. There is no given reality; it is created as you observe. This concept would indeed allow us to explain what we might call non-rational behaviour of traders, for instance. Traders are not irrational, but they can only observe the reality which their experience (and their learning) allows them to observe. This ontology gives power to the individual agents interacting in a network that all together co-create (in a dynamic process) reality. In fact this ontology is an emergent one in which knowledge and behaviour are continuously created via interaction and hence cannot be anticipated using top-down causal models. Indeed, this ontology is not based on causal relationships, but rather on synchronicity (being-together-in-time). We will get back to that later. For the time being, and in order to understand the essence of agents' behaviour, we make the choice that reality is created via the interaction of individual agents that create emergent behaviour. Using the words of the famous Spanish poet Machado: "there is no path; you lay down the path in walking."

Behavioral Finance and Technical Analysis: Interpreting Data from an Experiment on Irrational Exuberance, Part B: Reflections from Three Different Angles
Professors Walter Baets, Bernard Paranque and Henry Pruden, Euromed-Marseille École de Management

What do we understand by enacted and embodied cognition, within an autopoietic system? An autopoietic system is a concept out of neurobiology that describes the behaviour of any neurobiological colony, hence including human behaviour. An autopoietic system is one that organises and reproduces itself in a way that is ideal for survival. The human body is an excellent example of an autopoietic system. Cells in the body continuously reproduce to allow the body to survive. Furthermore, the body is completely self-organised. Within such a system we can identify a mind (say an individual's mind) that is embodied, which means that it is not just embrained (the computer metaphor) but literally distributed through the body via the sensors (the human senses) in continuous contact with its environment.

The environment co-creates the mind. Cognition that eventually will lead to behaviour is then enacted. Enaction has two dimensions: action and shaping. Therefore cognitive action always contains these two components: action and creation. All the rest is information. Here we hit a common misunderstanding between knowledge and information. Information is static (and linear) and therefore can be copied and repeated, whereas knowledge is dynamic (and non-linear) and therefore needs to be created each time over and again. Complexity theory (Nicolis and Prigogine) has proven that perspective to us over the last 30 years. The enacted view on knowledge (and behaviour) allows us to explore models that have creative force and show emergent behaviour.

An often-made assumption, which we presume is too limited, is that rational (human) behaviour could only be causal (based on the hidden ontological assumption described above). If it is causal, one can write it down in equations that in turn would drive reality. If we really believe in behavioural theories, then let us take this to its finality: agent theory.

For clarity's sake, we have already touched upon a few concepts of complexity theory (dynamic non-linear systems behaviour) that shed a completely different light on market behaviour (Baets, 1998a and b). Systems are auto-organisational, based on an embodied mind and on enacted cognition. Systems and knowledge are each time over and again re-created (which is, by the way, what our brain does, since it is the most efficient way of organisation). Reality is not Newtonian (a fixed time-space concept) but emergent (co-created in interaction). In my habilitation thesis I have called that "the quantum structure of business" (Baets, 2004). Complexity theory goes much further, but for the purpose of our argument, we can leave it here.

An interesting development, based on this complexity theory, is what we know as artificial-life research (Langton) and one of its further developments, i.e. agent-based simulation (Holland). Agent-based simulation is a development in artificial intelligence that, unlike what AI is unfortunately still known for, i.e. expert systems, exposes learning behaviour. Indeed, agent simulations are based on the interaction of individual agents that have individual qualities and purposes, and that agree upon a minimum set of interaction rules. Behaviour is clearly dynamic and produced in the continuous interaction of agents that exchange information with each other. The least one can say is that this is very much like human behaviour, particularly in financial markets. Whereas catastrophe theory implies a time dimension, agent-based simulation gives due importance to what Prigogine calls the "constructive role of time." Each instant we bring in the arrow of time, let us say the constructive role of interaction, behaviour gets created; it literally emerges.

This view supposes a number of "interacting" agents, within a specific field (of action), each having their personal qualities and goals and following a minimum set of interaction and exchange rules. The question then becomes how such a complex system could come to a coherent state. Most suggestions go in the same direction. Varela suggests resonance as the mechanism; Sheldrake suggests morphogenetic fields: sense is made out of interaction in a non-causal way. This mechanism of resonance is what occurs in "SWARM"-like societies (Epstein and Axtell, 1996). In fact we are talking here of agent theories. In agent theory, as already suggested, we only have to identify the playing ground (let us say a particular financial market) and a number of agents. Each agent is autonomous in achieving his goal(s) and is of course gifted with qualities (like experience, information, human characteristics). Those agents interact with each other based on a minimum number of interaction rules. Those rules govern the behaviour in the simulation, but they also define the learning of the different agents. Agents, then translating learning into (new) action, co-create in interaction with each other continuously new (and adapted) behaviour. Indeed, in such a market the "path is laid down in walking," just as reality happens to be in financial markets.
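To make the recipe concrete (a playing ground, autonomous agents with individual qualities, a minimal interaction rule, an emergent price), here is a deliberately tiny sketch in Python. It is not the SWARM toolkit or any model from the article: the belief-averaging rule, the shock size and every parameter value are illustrative assumptions.

```python
import random

def simulate_market(n_agents=50, n_steps=200, seed=42):
    """Toy agent-based market: each agent holds a private 'belief' about
    fair value; at each step one agent partially adopts another agent's
    belief (the minimal interaction rule), one agent receives a small
    idiosyncratic shock, and the 'price' is simply the emergent consensus
    of all beliefs. No price equation is specified anywhere."""
    rng = random.Random(seed)
    beliefs = [rng.uniform(90.0, 110.0) for _ in range(n_agents)]
    prices = []
    for _ in range(n_steps):
        # pairwise interaction: agent i moves halfway toward agent j's belief
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        beliefs[i] += 0.5 * (beliefs[j] - beliefs[i])
        # an occasional idiosyncratic shock keeps the system from freezing
        k = rng.randrange(n_agents)
        beliefs[k] += rng.gauss(0.0, 1.0)
        # the observed price emerges from the interacting agents
        prices.append(sum(beliefs) / n_agents)
    return prices

prices = simulate_market()
```

Even this toy illustrates the point of the paragraph above: the price path is nowhere written down in advance; it is "laid down in walking" by repeated local interactions among the agents.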

The argument takes catastrophe theory one step further to its intrinsic ultimate claim, i.e. that time plays a constructive role in (market) behaviour. In our own research (Baets 2004 and 2005) a number of projects are undertaken using agent theories, though not yet in the financial markets. Agent theories have been successfully used to visualise the emergence of innovation in a large consumer goods company, to visualize emergent market behaviour in order to identify an adapted market introduction strategy, and also to study emergent states in conflict handling.

The basic question does lead us back to the ontological choices we discussed earlier. Once we accept complexity theory as a promising paradigm we cannot avoid the question of causality. Quantum mechanics has given the world a tremendous dilemma: how is it possible that two photons moving in different directions still keep in instantaneous contact? As Pauli (Van Meijgaard), amongst others, suggests, there should indeed be interaction in a non-local field. Things seem to occur "at the same time" without having any causal relationship. It is this quantum structure of (financial) markets that deserves our attention in order to improve our understanding of market behaviour (Baets, 2004).

References

■ Baets W, 1998a, Organizational Learning and Knowledge Technologies in a Dynamic Environment, Kluwer Academic
■ Baets W, 1998b, Guest editor of a special issue of Accounting, Management and Information Technologies: Complex Adaptive Systems, Elsevier
■ Baets W, 2004, Une interprétation quantique des processus organisationnels d'innovation, thèse de HDR, IAE Aix-en-Provence
■ Baets W (ed), 2005, Knowledge Management: Beyond the Hypes, Kluwer Academic, forthcoming
■ Epstein J and Axtell R, 1996, Growing Artificial Societies, MIT Press
■ Holland J, 1998, Emergence: From Chaos to Order, Oxford University Press
■ Langton C (ed), 1989, Artificial Life, Santa Fe Institute Studies in the Sciences of Complexity, Proceedings, Vol 6, Addison-Wesley
■ Maturana H and Varela F, 1980, Autopoiesis and Cognition: The Realization of the Living, Reidel
■ Maturana H and Varela F, 1992, The Tree of Knowledge, Scherz Verlag
■ Nicolis G and Prigogine I, 1989, Exploring Complexity, Freeman
■ Sheldrake R, 1995, The Presence of the Past, Park Street Press
■ Sheldrake R and Bohm D, 1982, Morphogenetic fields and the implicate order, ReVision
■ Van Meijgaard H, 2002, Wolfgang Pauli Centennial 1900-2000, PhD thesis, TU Twente

Sub-Part Two by Dr. Bernard Paranque

Behavioral Finance and Technical Analysis

"Behavioral Finance and Technical Analysis" points out the coordination problem of the agents' actions. More precisely, the question is the effect of the action of certain types of agents on the collective welfare and, by consequence, how to mitigate the negative consequences. There are three main answers: the first is through laws and other professional rules, such as those of the SEC, Basle and so on; the second is the availability of tools allowing actors to avoid the problems, such as the one in the article quoted above; the last is at the level of the individual and his/her own capacity to take into account the collective interest, or so-called social welfare. Neoclassical economic theory says that, under specific hypotheses, the market, and particularly the financial market, is the best way to ensure the right allocation of resources. But, since those hypotheses are never verified, we need other tools to manage the market. We need to have tools to help us. We won't speak about laws and the regulatory rules that Basle II is prepared to impose. Rather, we will focus on individual behavior.

A lot of criticism points out the myopia of the agents and the mimetism of their decisions, even arguing that these attitudes are the cause of the breakdown in the distribution of social welfare. A breakdown can come when one has lost his confidence in the others or, more relevantly, when one decides to change because he is able to influence the market (in fact it is the only way to win, because the winner needs a lot of losers, that is, a lot of followers). It could be possible to anticipate the breakdown if we are able to identify the main proxies of these strategies, as was demonstrated in our article (Part A). I feel that our individual social responsibility is of true significance. This responsibility can't be assumed without clear rules of action. I would like to take an example from the work of Jensen.

In an article published in October 2001, Jensen highlighted the operational limitations of the prevailing interpretation and use made of value maximization and the stakeholder theory. He then engaged in a criticism of the central model of entrepreneurship, with its polar figures of the manager and the shareholder. In addition, he wanted to introduce the other stakeholders of the firm.

Without it being explicitly stated, it seems that the different financial scandals may have a bearing on the desire to explain the operating conditions proposed by the maximization versus stakeholder theories, which are in some ways competing and in other ways complementary.

On the one hand, it is argued that value maximization for the shareholder, with all the problems this type of monitoring entails, remains the best way to attain social welfare in a market economy. On the other hand, stakeholder theory stresses the need to take into account the interests of all of the stakeholders in a firm, including the customers, all of the suppliers, and the employees. According to Jensen, the complementarity of the two theories stems from the need to understand value maximization from a collective point of view: social welfare is only achieved when "all of the value" contributed by each of the stakeholders is maximized, and when this maximization of value occurs over the long term. The result is that the firm is recognized as a historical and complex organization.

However, an operational problem arises if managers are expected to maximize value thus defined, in that there is no reason why the objectives of the various stakeholders should coincide. This criticism is valid both from the point of view of value maximization (how can several objectives be managed simultaneously?) and that of stakeholder theory (how is a common objective to be defined?).

In fact, while Jensen recognizes the relevance of the stakeholder theory, he underlines a problem: this theory is not able to answer the question of how to manage several aims which could diverge. He says that before managing the firm, maximizing its value and taking into account the wishes of the stakeholders, there is the need to obtain an agreement: on the one hand about the hierarchy of the aims, and on the other hand about the modalities of their accomplishment and the monitoring of the performance of the firm.2

The agreement is the core of the deal and of the future performance because it will determine the manager's value maximization strategy, in particular in the field of the organization of the firm. For the supporters of the stakeholder theory there is a tool, the "balanced scorecard," but, in accordance with Jensen, they say nothing about the necessity to obtain agreement beforehand on the objectives from every participant involved in the firm and then, along the way, to build common rules to play by.

This concern means that "social welfare" implies dealing with "problems of information, anticipation and evaluation" (Salais et al., 1986, p. 193). In fact, at a collective level but also at an individual level, we need to agree on a common "reality," not only to build it but also to agree to act together in this perspective: "What is at stake in these negotiations is the interpretive model to be retained in order to 'construct the reality' that presents itself to them [the agents] as a problem to be solved" (id., pp. 197-198). In other words, this necessary negotiation expresses a convention through which "the agents' agreement on their description of the world, thus [allowing] them to coordinate their projects" (id., p. 236) is approved. That kind of agreement "rests on social processes of elaboration of models of representation of reality" (id., p. 239).

Then, the question is how to manage this agreement at a collective level and at an individual level. We need to identify specific coordination principles on which we can obtain an agreement from the stakeholders, and the availability of specific tools giving the opportunity to manage the collective behavior by anticipating the risk of breakdown, that is, the behavior of the one who does not play with the same aim. But it is not possible to negotiate this kind of agreement without discussing the relevance of the criteria of management and the sense of performance, and hence their different meanings among the stakeholders. For example, from the workers' point of view, the starting point must be the value added and not the EBITDA or the cash flow, because the value added is the condition of their wages, despite the fact that the wages have an influence on the profit.3

In total, "undertaking with efficiency presupposes mastering the uncertainty relating to future markets, technologies and products, and the coherence of one's own projects with respect to those of the other agents, partners or competitors" (id., p. 246).

Nevertheless, the main point is the coordination of the agents' behaviour, which deals with the management of uncertainty.

"In a context of relations with others from which one cannot abstract oneself, the uncertainty attached to the person must be understood as a communicational uncertainty. However, this designation is itself ambiguous, because it could suggest that uncertainty comes down to a problem of the circulation of information, to an imperfection. Yet information can circulate only if it has first been elaborated in a common language and if, consequently, it can be adjusted on both sides within a device congruent with it (for example, the presence of identical codes)" (Salais and Storper, 1993, pp. 76-78).

The wage regulation system (la forme salaire in French) "maintains the workforce unaware of the work that has been achieved" (Salais et al., p. 255), to the extent that the accomplished work is revealed through the produced value once the intermediary consumptions have been paid, namely the added value (see p. 227, as well as the written work by Paul Boccara on the subject at hand, 1985).

References

■ Boccara, P. (1985), Intervenir dans les gestions avec de nouveaux critères, Editions Sociales
■ Jensen, M. C. (2001), "Value Maximization, Stakeholder Theory, and the Corporate Objective Function" (October). Unfolding Stakeholder Thinking, eds. J. Andriof et al. (Greenleaf Publishing, 2002). Also published in JACF, V. 14, N. 3, 2001; European Financial Management Review, N. 7, 2001; and in Breaking the Code of Change, M. Beer and N. Nohria, eds., HBS Press, 2000. http://ssrn.com/abstract=220671
■ Paranque, B. (2004), "Toward an Agreement" (February). Euromed Marseille Ecole de Management Paper No. 11-2004. http://ssrn.com/abstract=501322
■ Storper, M. and Salais, R. (1997), Worlds of Production: The Action Framework of the Economy, Harvard University Press, London
■ Salais, R. and Storper, M. (1993), Les mondes de production, École des Hautes Études en Sciences Sociales, Paris
■ Salais, R., Baverez, N., Reynaud, B. (1986), L'invention du chômage, PUF (1999, édition PUF Collection Quadrige)


Sub-part Three by Henry O. Pruden, Ph.D.

Chart Analysis as Sequential Art

The technical analysis interpretation of the data from the Cal Tech Experiment on Irrational Exuberance found in the Pruden, Paranque and Baets article (Part A) invites critical reflection. The very nature of pattern recognition requires good judgment by the analyst in isolating and interpreting appropriate and significant portions of chart data. Technical tools such as trend lines are extremely useful in separating out portions of chart data for further analysis.

But as observed in our article, Part A, the technical patterns identified involved comprehensive and varying perspectives. That interpretation was an art form. Technical market analysis of charts can perhaps be enriched and made more reliable through an understanding and application of the principles of Sequential Art.4

In this section I propose to revisit four of the technical analysis rules or concepts that were presented in Part A. These will be re-interpreted with the aid of principles and patterns adopted with modification from the notions of Sequential Art. The reader will thus be given an opportunity to reflect upon the value added made possible for his/her applications of chart analysis and pattern recognition.

The four technical tools applied in Part A to be reviewed here in Part B are:

1. Fear vs. Greed Juxtaposed
2. Trading Range Channels Along Tops and Bottoms
3. Descending Price Peaks
4. Catastrophic Panics Causing Price Gaps

Graphics of each of these rules are contained in the Appendix to this Part B article.

1. FEAR VS. GREED JUXTAPOSED

In the original article (Part A), expressions of fear were observed as growing and expressions of greed as shrinking as the market price behavior neared the breakdown, the catastrophe jump point. One principle of Sequential Art is that of the interdependence between the sounds (musical notes) and the visual indications shown on the charts. The sound (words, musical notes) and the marks on the chart picture go hand in hand to convey the idea of changing market sentiment that neither could convey alone. Here the importance of how sounds, words, musical notes and pictorial indicators support each other's strengths can be seen. This gives rise to the suggestion that technical analysis ought to include the careful annotation of junctures where it can be seen that behavior is changing on a chart. Please notice how poignant the use of icons is, in this case musical symbols for high and low offers and bids that convey the changing juxtaposition of fear vs. greed. The combination of sound and visual clues also suggests a superior means of conveying the distinct characteristics of a sentiment indicator; they are a fine way of communicating the emotional content of the information.

2. TRADING RANGE CHANNELS ALONG TOPS AND BOTTOMS

The data shown in the trading range provide a good opportunity to cover the essentials of Sequential Art. First there is the ideal purpose, which in the case of trend channels is to outline the expected future course of price behavior... the idea is to define and extrapolate. The form employed was that of displaying price behavior over time in graphic panels, which is to say, to create a chart. A different type of form could have been the depiction of the trend through averaging and simplifying the data into a moving average. Thus the technical analyst, as a practical artist, has choices to make in the selection of form.

Another artful choice is the structure of the sequencing of chart data over time. A trend channel only makes sense if it has a beginning and an ending. In the case of the Cal Tech Experiment there were three structural sections: first, the trend of prices that reflected the progress, the growth, of speculation. Then there was a section labelled the dissipative gradient. That panel could have been made even more distinctive through a change in colors, say from green to yellow. Then the third section that could have been framed was the panic/crash in prices, and this third panel could have been further separated with the addition of the color red.

The artist/technician would thus set up a sequence of meaningful parts or patterns that, taken together, would tell the story of "boom and bust" upon the surface of the market. At a deeper level of analysis, the separation into three distinct yet interdependent sequential panels fits with the technician's vocabulary and the iconography of chart patterns that have been established through experience to communicate meaning as to the present position and probable future trend of a market. The separation into a sequence of separate panels really clarifies the picture and tells the story of the market.

3. DESCENDING PRICE PEAKS

In this case we cut incisively into the available chart to abstract a sequential order of events that the technical analyst then moulds into a pattern. An icon for symbolizing motion into the future and in the downward direction defined might be Marcel Duchamp's famous abstract painting "Nude Descending a Staircase." Indeed the entire "cubist movement" in visual art might be a rich area for a technical analyst to study. The parallels between the cubist art form and the work of the technical analyst are strong and suggest that borrowing from art depicting motion might pay the technical analyst a large dividend. Among other things, the analyst can become sensitized to how "connecting the dots" of the descending price peaks reveals a picture plane, gives closure to the unifying properties, and makes the viewer more aware of the design, the trend, as a whole rather than simply the individual components. This in effect is the beauty of trendlines. In support of simple trendlines it has been observed that "Duchamp, more concerned with the idea of motion than the sensation, would eventually reduce such concepts as motion to a single line."5

The moment-to-moment and action-to-action progression of the abstraction of "triple descending peaks" does not require much involvement by the viewer to interpret its meaning. It is clear-cut, decisive and powerful. Furthermore, the actions and intentions of the buyers and sellers which underlie the descending peaks lend themselves to a common-sense interpretation for grasping the implication of descending price peaks. Implicit in the foregoing conclusion is the realization that effective chart interpretation involves the analyst, who identifies and frames sequences, and then the observer, who reads and internalizes the sequences of panels and labels to inform himself/herself of the motion revealed and the action required.

4. CATASTROPHIC PANICS CAUSING PRICE GAPS

The discontinuity of price transaction behavior creates visual gaps or gutters that separate panels of price action. It is the acute imbalance between supply and demand which creates those gaps. Sequential Art identifies these gaps or separations, where nothing has been recorded, as gutters. What attracts the analyst's attention is the comparison of panels of price action before and after a gap. Why? Because the comparison has forecasting implications. The gaps or gutters fracture both time and space, offering a jagged, staccato rhythm of unconnected moments. But the observer's ability to construct continuity across panels generates the ability to mentally create closure. Like Sequential Art, this illustration of gap analysis reinforces that technical market analysis of chart behavior is very much an interplay between the observer and the observed.

Footnotes

1 Scott McCloud, Understanding Comics: The Invisible Art, The Kitchen Sink Press (a division of HarperCollins), 1993

2 Those interested may read a comment, in French, in Paranque (2004).

3 "La forme salaire maintient les salariés dans la méconnaissance du travail accompli" (page 255, Salais et alii), dans la mesure où ce travail accompli s'exprime dans la valeur produite une fois les consommations intermédiaires payées, à savoir la valeur ajoutée (voir page 227 et les travaux de Paul Boccara sur le sujet, 1985).


4 Scott McCloud, Understanding Comics: The Invisible Art, The Kitchen Sink Press (a division of HarperCollins), 1993

5 McCloud, page 108.

Appendix

Figure 1. A Cusp Catastrophe Model of a Stock Exchange

Figure 2. Dissipative Gradient

Figure 3. The Overall Results of the Experiment

Figure 4. Applying Technical Analysis

Figure 5. Fear vs. Greed Juxtaposed

Figure 6. Trading Range


Figure 7. Descending Price Peaks

Figure 8. Catastrophe Panic Causing Price Gaps

Figure 9. Mental Discipline Needed to Win the "Greater Fool" Game


About the Authors

DR. WALTER BAETS

Walter R. J. Baets is Director of Graduate Programs at Euromed Marseille Ecole de Management and Distinguished Professor in Information, Innovation and Knowledge at Universiteit Nyenrode, The Netherlands Business School. He is also director of Notion, the Nyenrode Institute for Knowledge Management and Virtual Education. Previously he was Dean of Research at the Euro-Arab Management School in Granada, Spain. He graduated in Econometrics and Operations Research at the University of Antwerp (Belgium) and did postgraduate studies in Business Administration at Warwick Business School (UK). He was awarded a Ph.D. from the University of Warwick in Industrial and Business Studies.

He pursued a career in strategic planning, decision support and IS consultancy for more than ten years before joining the academic world, first as managing director of the management development centre of the Louvain Universities (Belgium) and later as Associate Professor at Nijenrode University, The Netherlands Business School. He has been a Visiting Professor at the University of Aix-Marseille (IAE), GRASCE (Complexity Research Centre) Aix-en-Provence, ESC Rouen, KU Leuven, RU Gent, Moscow, St Petersburg, Tyumen and Purdue University. Most of his professional experience was acquired in the telecommunications and banking sectors. He has substantial experience in management development activities in Russia and the Arab world.

His research interests include: Innovation and knowledge; Complexity, chaos and change; The impact of (new information) technologies on organisations; Knowledge, learning, artificial intelligence and neural networks; On-line learning and work-place learning.

He is a member of the International Editorial Board of the Journal of Strategic Information Systems, Information & Management, and Systèmes d'Information et Management. He has acted as a reviewer/evaluator for a number of international conferences (e.g. ECIS and ICIS) and for the EU RACE programme. He has published in several journals including the Journal of Strategic Information Systems, The European Journal of Operations Research, Knowledge and Process Management, Marketing Intelligence and Planning, The Journal of Systems Management, Information & Management, The Learning Organization, and Accounting, Management and Information Technologies. He has organised international conferences in the area of IT and organizational change.

Walter Baets is the author of "Organizational Learning and Knowledge Technologies in a Dynamic Environment," published in 1998 by Kluwer Academic Publishers, and co-author with Gert Van der Linden of "The Hybrid Business School: Developing Knowledge Management through Management Learning," published by Prentice-Hall in 2000. Along with Bob Galliers he co-edited "Information Technology and Organizational Transformation: Innovation for the 21st Century Organization," also published in 1998 by Wiley. In 1999, he edited "Complexity and Management: A Collection of Essays," published by World Scientific Publishing. Recently he co-authored "Virtual Corporate Universities," published in 2003 by Kluwer Academic.

DR. BERNARD PARANQUE

Bernard Paranque is a doctor of economics (University of Lyon Lumière, 1984) and holds the "Habilitation à diriger les recherches" (1995). He began his career as an associate economist in an accountancy firm in 1984.

In 1990, he joined the business department of the Banque de France (the French central bank). From 1990 to 2000 he produced papers on the financial structure of non-financial companies (www.ssrn.com). He was a representative of the Banque de France on the European Committee of Central Balance Sheet Offices between 1993 and 2002.

In 1999, he was on secondment from the Banque de France to the Secretary of State for SMEs, where he was in charge of the "business financing" department. He was also a member of the French delegation to the SMEs working party of the Business and Environment Committee of the OECD.

His research refers to the "économie des conventions" and is focused on the financial behavior of non-financial organizations and the promotion of specific tools and assessment procedures designed to enhance SMEs' access to financing.

He is co-author, with Bernard Belletante and Nadine Levratto, of "Diversité économique et mode de financement des PME," published in 2001. He is also the co-author, with Hans Friderichs, of "Structures of Corporate Finance in Germany and France" in Jahrbücher für Nationalökonomie und Statistik, 2001.

He is an associate researcher of the CNRS team IDHE-ENS Cachan in Paris and a member of the New York Academy of Sciences.

He joins Euromed Marseille Ecole de Management as Professor of Finance and Head of the "Information and Finance" department.

DR. HENRY O. PRUDEN

Hank Pruden is a visiting scholar at Euromed Marseille Ecole de Management, Marseille, France during 2004-2005. Professor Pruden is a professor in the School of Business at Golden Gate University in San Francisco, California, where he has been teaching for 20 years. Hank is more than a theoretician; he has actively traded his own account for the past 20 years. His personal involvement in the market ensures that what he teaches is practical for the trader, and not just abstract academic theory.

He is the Executive Director of the Institute of Technical Market Analysis (ITMA). At Golden Gate he developed the accredited courses in technical market analysis in 1976. Since then the curriculum has expanded to include advanced topics in technical analysis and trading. In his courses Hank emphasizes the psychology of trading as well as the use of technical analysis methods. He has published extensively in both areas.

Hank has mentored individual and institutional traders in the field of technical analysis for many years. He is presently on the Board of Directors of the Technical Securities Analysts Association of San Francisco and is past president of that association. Hank was also on the Board of Directors of the Market Technicians Association (MTA). Hank has served as vice chair, Americas, of IFTA (International Federation of Technical Analysts); IFTA educates and certifies analysts worldwide. For eleven years Hank was the editor of The Market Technicians Association Journal, the premier publication of technical analysts. From 1982 to 1993 he was a member of the Board of Trustees of Golden Gate University.


The most basic tenet of contrarian investing is that one should buy when others are fearful and sell when they are eager to buy.

The definitions of "fearful" and "eager" are open to interpretation, but one assumption that has persisted over the decades is that low levels of cash reserves held at mutual fund firms were a sign of excessive optimism. Looking at the relationship between cash reserves and the risk-free rate of return, however, suggests that portfolio manager sentiment is not the only – or perhaps even the largest – component of cash reserve levels. By backing out the effects of interest rates, we can get a better feel for the sentiment of these portfolio managers, as well as potential stock market returns going forward.

Cash Reserves vs. the Risk-Free Rate

As of June 2004, liquid assets of stock mutual funds, expressed as a percentage of total net assets, stood at 4.3%. This level of reserves, relative to total assets, was one of the lowest in the history of reported data. At the time the figures were released, there was a great deal of media attention focused on the idea that fund managers in the United States were overly enthusiastic about the prospect of future gains in the equities market, and thus the market was likely going to have difficulty making significant advances. The logic of such an argument may be sound, but a look into another, perhaps more important, factor sheds some light on why cash levels at mutual funds were so low. There are many reasons why a fund would hold a low level of cash:

● They believe the market is going higher and want to be as fully invested as possible.

● They use derivative securities (such as futures and options) and don't need actual cash on hand in order to hedge their portfolios.

● Their charter (or a mandate from investors or management) requires that they remain as invested as possible, having enough cash on hand only to meet expected redemptions. They are not expected to time the market, only to find good stocks. With the improved reporting systems now in place at some fund firms, portfolio managers can see redemptions on virtually a real-time basis, reducing the likelihood that they will wake up one day with a cash crunch.

● The increased influence of index funds precludes market timing. These managers aren't expected to give investors a positive absolute return; they are only expected to beat their respective benchmark index. Having a high level of cash increases their chances of underperforming their benchmark in a rising market.

● There aren't many other instruments available that would give their investors an acceptable reward for the risk they are taking.

It is on that last point that I wish to focus. When short-term interest rates are high, mutual funds have an incentive to hold cash. If there is a risk-free investment that will pay an 8% return, is it unreasonable to expect a fund manager to shift funds there as opposed to risking them in the equities market, where they may get an 8% return during a good year, but with a great deal more risk? Most of us would surely switch to the risk-free opportunity. For these purposes, we will use the yield on 90-day Treasury Bills as the risk-free rate of return.

This assumption is certainly supported by the numbers. From 1954 through 2003, the correlation between mutual fund cash levels and the 90-day T-Bill rate was 0.74, which means that the prevailing level of interest rates can theoretically explain 55% of why mutual fund cash levels are where they are. Figure 1 shows this correlation - there is a clear upward slope to the scatter plot, with minimal variation.

FIGURE 1

Correlation Between Cash Level and Risk-Free Rate, 1954 - 2003

With 591 data points, the probability of the correlation between cash levels and interest rates being due to chance alone is essentially zero.
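Both claims follow directly from the correlation itself: the share of variance "explained" is the square of r, and the near-zero probability can be checked with the usual t-statistic for a correlation coefficient. A minimal sketch, taking only r and n from the text (the variable names are mine):

```python
import math

r = 0.74   # correlation between cash levels and the 90-day T-Bill rate, 1954-2003
n = 591    # monthly observations

# Share of variance in cash levels theoretically explained by interest rates
r_squared = r ** 2                                    # about 0.55, i.e. 55%

# t-statistic for testing the null hypothesis of zero correlation
t_stat = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

print(f"r^2 = {r_squared:.2f}")   # prints r^2 = 0.55
print(f"t = {t_stat:.1f}")        # roughly 26.7 - far beyond any conventional cutoff
```

With a t-statistic that large on 589 degrees of freedom, the p-value is indistinguishable from zero, which is the "essentially zero" probability cited above.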

Using regression analysis, we can see the relationship between interest rates and cash reserves. This will allow us to determine what an "appropriate" level of cash reserves may be given a certain interest rate, which we can then use to compare to current cash reserves. If current reserves are too low given prevailing rates, then fund managers may be overly optimistic; if they are too high, then they may be excessively pessimistic.

Using data from 1954 through 2003, the regression formula for the relationship between interest rates and mutual fund cash reserves is:

y = 0.4978x + 4.5464

where

y = expected cash reserve
x = current rate on 90-day T-Bills

We can round off these figures and still retain the usefulness of the formula. Put into different terms, the regression formula tells us that cash reserves during any given month should be approximately 4.5% plus 50% of the current yield on 90-day T-Bills. Theoretically, if 90-day T-Bills were yielding 0%, then mutual funds would be expected to carry 4.5% of their assets in liquid investments. This is a "baseline" amount of cash, presumably needed to cover expenses, redemptions and the like.
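The fitted line and its rounded rule of thumb can be wrapped in two small helpers (a sketch; the function names are mine, the coefficients are those quoted above):

```python
def expected_cash_reserve(tbill_yield_pct: float) -> float:
    """Expected mutual fund cash reserve (%) for a given 90-day T-Bill
    yield (%), using the 1954-2003 regression coefficients from the text."""
    return 0.4978 * tbill_yield_pct + 4.5464

def expected_cash_reserve_rounded(tbill_yield_pct: float) -> float:
    """Rounded rule of thumb: a 4.5% baseline plus half the T-Bill yield."""
    return 4.5 + 0.5 * tbill_yield_pct

# At a 0% T-Bill yield the model implies the ~4.5% "baseline" cash holding
print(expected_cash_reserve(0.0))          # prints 4.5464
print(expected_cash_reserve_rounded(3.0))  # prints 6.0
```

The two versions agree to within a few hundredths of a percent across the historical range of T-Bill yields, which is why the rounded rule retains the usefulness of the full formula.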

Mutual Fund Cash Reserves, the Risk-Free Rate and Stock Market Performance

Jason Goepfert

Cash reserves at mutual funds in the United States reached historically low levels in 2004. The traditional interpretation suggests that fund managers were too optimistic on likely future stock market gains. But is the traditional interpretation accurate?


Figure 2 uses the regression formula to show what percentage of cash we would expect mutual funds to hold given a range of values in T-Bill yields.

FIGURE 2

T-Bill Yields and Expected Cash Reserves

90-Day T-Bill Yield    Expected Cash Reserve
 1.0%                   5.0%
 2.0%                   5.5%
 3.0%                   6.0%
 4.0%                   6.5%
 5.0%                   7.0%
 6.0%                   7.5%
 8.0%                   8.5%
10.0%                   9.5%

We know that in June 2004, cash reserves were at 4.3% of total assets. On June 30th, the yield on 90-day T-Bills was 1.31%. By plugging that value into the regression formula, we estimate that mutual funds should have carried 5.20% of their assets in cash. By taking the difference between what was expected and what was fact, we can conclude that mutual funds were carrying a "cash deficit" of 0.90%:

Actual - Expected = Surplus/(Deficit)
4.30% - 5.20%    = (0.90%)

By going back and comparing actual levels of cash to those that were expected given the prevailing level of interest rates, we can get a better handle on the sentiment of portfolio managers without the distorting effects of interest rates on cash reserves. The difference between actual and expected reserves will show whether fund managers are giving a "premium" or "discount" to cash, and should create an effective contrary sentiment indicator. For purposes of brevity, we will call the difference between actual and expected cash reserves RAPAD (Rate Adjusted Premium And Discount).
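Under that definition, RAPAD is a single subtraction (a sketch; the function name is mine, the regression coefficients are those given earlier):

```python
def rapad(actual_reserve_pct: float, tbill_yield_pct: float) -> float:
    """Rate Adjusted Premium And Discount: actual minus expected cash reserve.
    Negative values mark a cash "deficit" (manager optimism); positive values
    mark a cash "premium" (manager pessimism)."""
    expected = 0.4978 * tbill_yield_pct + 4.5464
    return actual_reserve_pct - expected

# June 2004: 4.3% actual reserves against a 1.31% T-Bill yield
print(round(rapad(4.30, 1.31), 2))   # prints -0.9, the "cash deficit" above
```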

Adjusted Reserves as Sentiment Indicator

During the 49 years of the study, the mean value of the RAPAD measure was 0.0%, with a standard deviation of 1.5%. The distribution of readings from this measure hugs closely to a normal bell curve, so standard statistical measures should apply. If we look at how the market, defined as the S&P 500 cash index, performed after abnormal readings, we can begin to get an idea of how effective this measure may be at highlighting high- or low-risk times in the stock market. For these purposes, we are defining "abnormal" as any reading more than 1.5 standard deviations away from the mean, which in this case would equate to all RAPAD readings less than -2.25% and greater than +2.25%. Put another way, we will see how the market performed after any month when mutual funds held 2.25% more or less cash than they should have held given the prevailing level of interest rates.

FIGURE 3

S&P 500 Performance After RAPAD Reading of -2.25% or Below (Extreme Cash Discount)

                   6 Months   12 Months   18 Months   24 Months
                   Later      Later       Later       Later
Average Return     -3.0%      -6.1%       -5.5%       -1.8%
Percent Positive   31%        22%         36%         50%

FIGURE 4

S&P 500 Performance After RAPAD Reading of +2.25% or Above (Extreme Cash Premium)

                   6 Months   12 Months   18 Months   24 Months
                   Later      Later       Later       Later
Average Return     8.3%       14.1%       19.4%       23.0%
Percent Positive   81%        89%         98%         100%

Figure 3 shows how the S&P 500 performed for up to 2 years after mutual funds were holding cash reserves that were at least 2.25% less than they should have been given the level of short-term interest rates at the time. The primary reason for giving cash such a discount was likely that the fund managers felt very optimistic about the future gains they were likely to make in the stock market, so they felt the need to be as fully invested as possible. As we can see from the table, this optimism was generally unwarranted. If we look at the results after 12 months, the S&P 500 showed an average return of -6.1%.

Looking at the months where cash levels were in a "normal" range (meaning RAPAD readings within 1.5 standard deviations of the mean), the average 12-month return in the S&P 500 was 8.7% during the study period. One-year returns after extreme cash discounts therefore underperformed an average return by 14.8%. We also see from Figure 3 that the S&P 500 was higher 12 months later only 22% of the time. There were 36 months that were considered to show an extreme cash discount, and only 8 times out of those 36 instances was the S&P 500 higher one year later.

Figure 4 gives us the performance after periods of extreme cash premiums, meaning those times when fund managers held at least 2.25% more cash than expected. The results here are markedly different from Figure 3. After 12 months, the S&P 500 was an average of 14.1% higher, outperforming an average month by 5.4%. Out of the 53 months that qualified as exhibiting an extreme cash premium, 47 led to a higher market one year later, for a "success rate" of 89%. See Appendix A for a detailed list of all extreme RAPAD readings during the study period.

Figure 5 below shows the correlation between RAPAD readings and S&P 500 returns 12 months later.

FIGURE 5

S&P 500 12-MONTH RETURNS AND RAPAD READINGS

The correlation between RAPAD readings and returns in the S&P 500 one year later is 0.32, suggesting that if we knew nothing else but what the current RAPAD reading was, we could improve our prediction of where the S&P 500 would close one year later by about 11%.

Why Adjust for Interest Rates?

A valid question is why we have to adjust for interest rates at all - aren't cash levels by themselves a good enough indicator of excessive optimism or pessimism by fund managers? Monitoring cash levels on their own can indeed be an adequate contrary guide. However, there have been times when adjusting for interest rates has given a much better indication of excess. Figure 6 highlights just such an instance.


FIGURE 6

CASH LEVELS VS. RATE-ADJUSTED CASH LEVELS

Figure 6 shows us a plot of the S&P 500 (top scale), the raw values of mutual fund cash reserves (middle scale), and the RAPAD measure of cash premiums and discounts (lower scale).

On the chart, Point A corresponds to July 1976. At the time, 90-day T-Bills were yielding about 5.2%. According to the regression formula, mutual funds should have been holding about 7.1% of their assets in cash. However, they were holding only 4.7% cash, so they were holding about 2.4% less in cash reserves than they should have been given the level of short-term rates at the time. This was a show of extreme optimism on the part of fund managers, and the S&P refused to accommodate it, declining into the beginning of 1978.

By early 1980, managers had built up their cash reserves once more, just in time for a stiff market rally over the next year. In January 1981 (Point B on the chart), 90-day T-Bill rates had climbed all the way up to 14.6%, giving fund managers a very enticing incentive to hold large amounts of cash. They did have significantly more cash then than they did in 1976. At Point A, cash levels were around 4.7%, as stated above. At Point B, cash levels stood at 8.3%. Taken on its own, one could have easily concluded that fund managers were nowhere near as optimistic at Point B as they were at Point A. However, when we factor in prevailing interest rates, theoretically fund managers should have been holding 11.8% of their assets in cash at the time. Since they only had 8.3% in cash reserves, they were once again deficient by an extreme amount (3.5%). This told us that fund managers were indeed too optimistic, contrary investors should have expected a market decline (or at least difficulty making much headway), and the S&P ultimately declined sharply over the next one and a half years.

Out-of-Sample Testing and Other Technical Analysis Applications

In order to get an idea of how this method would have worked in real time (without the perfect knowledge of hindsight), out-of-sample testing is necessary. This is where we take only a portion of the data as the look-back period for the regression formula, and then test to see how it would have predicted future moves in the S&P 500.

Using the period from 1954 through 1976 as the look-back period, the regression line between mutual fund cash levels and the 90-day T-Bill rate remained quite consistent with what was presented above:

y = 0.5336x + 4.0163

where

y = expected cash reserve
x = current rate on 90-day T-Bills

When we take this formula and determine the cash deficit or cash surplus from 1977 through 2003 (the out-of-sample period), we can determine how well it would have predicted future stock market returns. In Figure 5, we showed the correlation between RAPAD readings and one-year S&P 500 returns as being 0.32. Using this out-of-sample test, the correlation from 1976 through 2003 dropped to 0.21. However, given that correlation and the number of data points in the sample, once again the chances are virtually zero that this relationship occurred by chance alone.

In Figures 3 and 4, we showed how the S&P 500 performed after the cash premium or discount reached extreme levels. Taking the same approach with the out-of-sample test, the results were very consistent. Here, "extreme" is considered to be any cash discount of -1.75% or less or any cash premium of +1.75% or more. One year after extreme cash discounts, the S&P 500 was higher 25% of the time, with an average return of -5.6%. One year after extreme cash premiums, the S&P 500 was higher 89% of the time, with an average return of 12.1%.

These results compare very favorably to those obtained previously, suggesting that the predictive power of this approach held up even during the out-of-sample testing. As with most contrary indicators, the RAPAD measure became most effective when it was giving extreme readings one way or the other.

It may be possible to achieve similar or even superior market-timing results by applying basic technical analysis to the cash levels themselves, without the need to adjust for interest rates. To test this, we used a simple moving average crossover system applied to the cash balances. We went long the S&P 500 cash index when a 12-month average of cash balances fell below its 60-month average, and then sold when the 12-month average crossed back above the 60-month average.
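Such a crossover system can be sketched as follows (the 12- and 60-month windows are those stated above; the function names and monthly-series representation are my own):

```python
def sma(series, window, i):
    """Simple moving average of `series` over `window` points ending at index i."""
    return sum(series[i - window + 1 : i + 1]) / window

def crossover_signals(cash, fast=12, slow=60):
    """Return one 'long'/'flat' flag per month of a cash-balance series.
    'long' when the fast average of cash balances sits below the slow average
    (the condition for being long the index in the system described above)."""
    signals = []
    for i in range(len(cash)):
        if i < slow - 1:
            signals.append("flat")      # not enough history for the slow average
        elif sma(cash, fast, i) < sma(cash, slow, i):
            signals.append("long")
        else:
            signals.append("flat")
    return signals
```

Trades fall out of the transitions: a "flat" to "long" flip is a buy, and "long" back to "flat" is a sell, which is where the whipsaws mentioned below come from.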

Such a system did have some merit, as it would have kept an investor out of the bad markets of 1974 and 1987. It also would have kept one long during the roaring bull market of the 1990s. However, as with most crossover systems, whipsaws were an issue. Out of the 7 signals, 3 of them were losers, losing an average of 9%. The four winners, however, gained an average of 76% (due mainly to the 222% gain from the 1990s).

If we used a very simple RAPAD method of going long when RAPAD first crossed above +2.25 and selling when it first crossed below -2.25 (so we would be buying when mutual fund cash reserves first became extremely high, and we would hold until they became extremely low), there would have been only four trades by this strict methodology. All four were winners, for an average gain of 155% (skewed by a 450% gain from the system going long in October 1985 and holding through March 1998). Since the data is released to the public with a one-month delay, we used the S&P closing prices as of the date one would have received the data, which reduced the returns somewhat but kept the trades much more based in reality. See Appendix B for a chart of each of the trade signals.
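The strict entry/exit rule above can be expressed as a small state machine. This is a sketch of the rule as stated, run on a hypothetical RAPAD series (the function name and data are ours):

```python
def rapad_trades(rapad, buy_level=2.25, sell_level=-2.25):
    """(entry index, exit index) pairs for the strict RAPAD system:
    enter when RAPAD first crosses above +2.25 (extreme cash surplus),
    exit when it first crosses below -2.25 (extreme cash deficit)."""
    trades, holding, entry = [], False, None
    for i in range(1, len(rapad)):
        if not holding and rapad[i - 1] <= buy_level < rapad[i]:
            holding, entry = True, i
        elif holding and rapad[i - 1] >= sell_level > rapad[i]:
            trades.append((entry, i))
            holding = False
    return trades
```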

Other Factors

In the opening paragraph, we highlighted several other factors, besides competing assets, which may affect mutual fund cash reserves. We have looked at what relationship some of those factors have with cash reserves, and there does seem to be a correlation. However, since many of these developments are so new, we do not have enough data to draw reliable conclusions. Still, it is instructive to discuss the impacts of these variables on cash reserves so that we can more readily observe their impact going forward.

The listed options market has grown steadily over the past 10 years. In 1993, the Chicago Board Options Exchange was clearing approximately 9,000,000 options contracts on a monthly basis. By the end of 2003, that volume had tripled. The correlation between monthly options volume on the CBOE and mutual fund cash levels from 1993-2003 was -0.66. This tells us that there was a large negative correlation between option volume and cash levels: as option volume increased, cash levels decreased. This could be a significant factor; however, we are limited by a lack of reliable option volume data. Also, interest rates during this time were steadily decreasing. As we saw above, interest rates have had a definite impact on cash levels over nearly 50 years of data, so it is difficult to determine if cash levels were impacted more by option activity or by interest rates.

We also checked the correlation between cash reserves and futures market activity. For the latter, we used commercial trader positions (both long and short) in the large S&P 500 futures contract from 1986-2003. According to the Commodity Futures Trading Commission (CFTC), a commercial trader is a large trader (the definition of “large” has changed over the years) engaged in the futures market for the specific purpose of hedging the trader’s daily business activity. Comparing month-end positions in the Commitments of Traders report, there was a correlation of -0.80 between the futures positions and mutual fund cash reserves. This is a very tight correlation and tells us that as futures positions increased, cash reserves decreased and vice versa. Once again, however, we are limited by the fact that interest rates declined for most of this period.

It would be possible to use a multiple regression formula to determine where mutual fund cash reserves should be, instead of using only interest rates as described above. However, until more time passes in which we see varying levels of option and futures activity, the utility of that exercise is probably limited.

Another likely factor in cash balances is the impact of rising or falling market prices themselves, regardless of fund manager sentiment. When market prices rise, we should see a decline in the percentage of assets held in cash, simply because the total portfolio is worth more than it was before.

To check whether this may be the case or not, we looked at the month-to-month percentage change in the S&P 500 and compared it to the month-to-month change in mutual fund cash balances. The correlation was -0.38, which means that one factor could statistically account for around 14% of the movement in the other. Since the correlation is negative, it helps to confirm the theory that cash balances would fall when prices rise and vice versa. The same negative correlation holds (although it falls to -0.20) when we look at this month’s change in the S&P 500 and next month’s change in cash balances. This is another significant factor and should be included in any discussion about whether cash levels have moved an inordinate amount in any given period.
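The 14% figure comes from squaring the correlation coefficient (r² is the share of variance the two series have in common). The helper below, a pure standard-library sketch, computes a Pearson correlation from scratch and verifies the arithmetic:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# r of -0.38 implies r^2 of about 0.14: roughly 14% shared variance
print(round((-0.38) ** 2, 2))  # 0.14
```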

However, it is important to distinguish those times when funds are holding low levels of cash because they are overly optimistic versus those times when they are holding low cash reserves because there are few other alternatives. With 90-day T-Bill rates yielding barely above 1% at the time, June 2004 was certainly one of the latter.

This does not mean the stock market cannot, or should not, decline; it simply means that overzealous fund managers are not necessarily a catalyst. If we see rates rise significantly in 2004, but cash reserves at mutual funds hold steady or decline, then there may be some real evidence that fund managers are excessively optimistic.

As we saw from Figure 3, overly optimistic portfolio managers are a goodsign that whatever rally is in place may be about to lose steam.

Recent Activity

As of June 2004, mutual funds held 4.3% of their total assets in liquid assets. Given the low level of short-term interest rates at the time, it is not entirely unexpected that cash reserves would be so low. Still, anytime the absolute level of cash is low, we believe investors should be worried. While fund companies have better reporting systems now than they did 20 years ago, there is still the possibility of a “cash crunch,” whereby unexpected redemptions cause heavy selling by mutual funds to meet the redemptions since they do not have adequate cash on hand to cover them.

This, of course, would exacerbate the market decline that is likely the reason for the redemptions in the first place.

Sources

Investment Company Institute (http://www.icinet.net/)

The Federal Reserve Bank of St. Louis (http://www.stlouisfed.org/)

Appendix A

The table below outlines each month where the RAPAD reading was considered extreme. The table gives the month of the occurrence, the S&P 500 cash index level at the time, the RAPAD reading for that month, and the return in the S&P 500 cash index 6, 12, 18 and 24 months later.

                                        S&P 500 Return
Date        S&P 500    RAPAD    6 Mo. Later  12 Mo. Later  18 Mo. Later  24 Mo. Later

All Occurrences with RAPAD Readings of -2.25 and Below (Extreme Cash Deficit)
1/30/81      129.55    -3.55      1.1%    -7.1%   -17.3%    12.2%
3/31/00     1498.58    -3.39     -4.1%   -22.6%   -30.5%   -23.4%
4/30/81      132.81    -3.22     -8.2%   -12.3%     0.7%    23.8%
7/31/81      130.92    -3.06     -8.0%   -18.2%    11.0%    24.2%
5/29/81      132.59    -2.99     -4.7%   -15.6%     4.5%    22.5%
2/27/81      131.27    -2.98     -6.5%   -13.8%    -9.0%    12.8%
2/29/00     1366.42    -2.94     11.1%    -9.3%   -17.0%   -19.0%
8/31/81      122.79    -2.94     -7.9%    -2.7%    20.6%    33.9%
1/31/00     1394.46    -2.90      2.6%    -2.0%   -13.1%   -19.0%
8/31/00     1517.68    -2.90    -18.3%   -25.3%   -27.1%   -39.6%
1/31/73      116.03    -2.84     -6.7%   -16.8%   -31.6%   -33.7%
6/30/81      131.21    -2.82     -6.6%   -16.5%     7.2%    27.8%
12/31/99    1469.25    -2.82     -1.0%   -10.1%   -16.7%   -21.9%
9/30/71       98.34    -2.81      9.0%    12.4%    13.4%    10.3%
4/30/98     1111.75    -2.77     -1.2%    20.1%    22.6%    30.6%
12/29/72     118.05    -2.77    -11.7%   -17.4%   -27.1%   -41.9%
11/28/80     140.52    -2.69     -5.6%   -10.1%   -20.4%    -1.4%
6/30/71       98.70    -2.62      3.4%     8.6%    19.6%     5.6%
5/29/98     1090.82    -2.58      6.7%    19.3%    27.3%    30.2%
5/31/71       99.63    -2.56     -5.7%     9.9%    17.1%     5.3%
12/31/80     135.76    -2.55     -3.4%    -9.7%   -19.3%     3.6%
7/31/00     1430.83    -2.55     -4.5%   -15.3%   -21.0%   -36.3%
3/31/98     1101.75    -2.55     -7.7%    16.8%    16.4%    36.0%
11/30/99    1388.91    -2.51      2.3%    -5.3%    -9.6%   -18.0%
6/30/00     1454.60    -2.49     -9.2%   -15.8%   -21.1%   -32.0%
5/31/72      109.53    -2.48      6.5%    -4.2%   -12.4%   -20.3%
4/28/00     1452.43    -2.46     -1.6%   -14.0%   -27.0%   -25.9%
7/30/71       95.58    -2.44      8.7%    12.4%    21.4%    13.2%
6/30/98     1133.84    -2.42      8.4%    21.1%    29.6%    28.3%
9/30/76      105.24    -2.42     -6.5%    -8.3%   -15.2%    -2.6%
7/30/76      103.44    -2.38     -1.4%    -4.4%   -13.7%    -2.7%
10/29/76     102.90    -2.33     -4.3%   -10.3%    -5.9%    -9.5%
6/30/76      104.28    -2.33      3.0%    -3.6%    -8.8%    -8.4%
3/31/81      136.00    -2.28    -14.6%   -17.7%   -11.5%    12.5%
8/31/76      102.91    -2.26     -3.0%    -6.0%   -15.4%     0.4%
9/29/00     1436.51    -2.26    -19.2%   -27.5%   -20.1%   -43.2%

Average Return                     -3.0%    -6.1%    -5.5%    -1.8%
Number of Occurrences                 36       36       36       36
Number of Positive Occurrences        11        8       13       18
Positive Occurrences as % of Total   31%      22%      36%      50%
Maximum Return                     11.1%    21.1%    29.6%    36.0%
Minimum Return                    -19.2%   -27.5%   -31.6%   -43.2%


All Occurrences with RAPAD Readings of +2.25 and Above (Extreme Cash Surplus)
10/29/93     467.83     2.25     -3.6%     1.0%    10.0%    24.3%
11/29/74      69.97     2.26     30.3%    30.4%    43.2%    45.9%
11/28/86     249.22     2.27     16.4%    -7.6%     5.2%     9.8%
1/31/94      481.61     2.28     -4.8%    -2.3%    16.7%    32.1%
5/30/80      111.24     2.29     26.3%    19.2%    13.6%     0.6%
6/30/88      273.50     2.29      1.5%    16.3%    29.2%    30.9%
6/30/58       45.24     2.31     22.0%    29.2%    32.4%    25.8%
4/30/86      235.52     2.32      3.6%    22.4%     6.9%    11.0%
6/30/93      450.53     2.35      3.5%    -1.4%     1.9%    20.9%
7/31/86      236.12     2.37     16.1%    35.0%     8.9%    15.2%
6/29/90      358.02     2.40     -7.8%     3.7%    16.5%    14.0%
7/31/70       78.05     2.41     22.8%    22.5%    33.2%    37.6%
8/31/88      261.52     2.42     10.5%    34.4%    26.9%    23.3%
7/31/90      356.15     2.43     -3.4%     8.9%    14.8%    19.1%
5/31/93      450.19     2.44      2.6%     1.4%     0.8%    18.5%
6/30/92      408.14     2.48      6.8%    10.4%    14.3%     8.9%
8/29/86      252.93     2.48     12.4%    30.4%     5.9%     3.4%
10/30/92     418.68     2.48      5.1%    11.7%     7.7%    12.8%
11/30/92     431.35     2.53      4.4%     7.1%     5.8%     5.2%
9/30/88      271.91     2.54      8.4%    28.4%    25.0%    12.6%
10/31/66      80.20     2.55     17.2%    16.3%    21.5%    28.9%
7/30/93      448.13     2.55      7.5%     2.3%     5.0%    25.4%
2/29/88      267.82     2.56     -2.4%     7.9%    31.2%    23.9%
10/31/86     243.98     2.57     18.2%     3.2%     7.1%    14.3%
10/31/85     189.82     2.57     24.1%    28.5%    51.9%    32.6%
2/28/94      467.14     2.58      1.8%     4.3%    20.3%    37.1%
7/29/88      272.02     2.59      9.4%    27.2%    21.0%    30.9%
9/30/92      417.80     2.61      8.1%     9.8%     6.7%    10.7%
7/31/92      424.21     2.67      3.4%     5.6%    13.5%     8.0%
5/31/88      262.16     2.75      4.4%    22.3%    32.0%    37.8%
11/30/89     345.99     2.78      4.4%    -6.9%    12.7%     8.4%
2/26/93      443.38     2.79      4.6%     5.4%     7.2%     9.9%
1/29/88      257.07     2.85      5.8%    15.7%    34.6%    28.0%
5/31/90      361.23     2.90    -10.8%     7.9%     3.9%    15.0%
3/31/88      258.89     2.91      5.0%    13.9%    34.9%    31.3%
9/30/86      231.32     2.97     26.1%    39.1%    11.9%    17.5%
8/31/92      414.03     2.98      7.1%    12.0%    12.8%    14.8%
4/30/87      288.36     3.00    -12.7%    -9.4%    -3.3%     7.4%
1/31/90      329.08     3.10      8.2%     4.5%    17.8%    24.2%
3/31/93      451.67     3.11      1.6%    -1.3%     2.4%    10.9%
2/28/90      331.89     3.19     -2.8%    10.6%    19.1%    24.3%
10/30/87     251.79     3.23      3.8%    10.8%    23.0%    35.2%
4/29/88      261.33     3.28      6.8%    18.5%    30.2%    26.6%
3/30/90      339.94     3.47    -10.0%    10.4%    14.1%    18.8%
4/30/93      440.19     3.51      6.3%     2.4%     7.3%    16.9%
8/31/90      322.56     3.67     13.8%    22.6%    27.9%    28.4%
12/31/90     330.22     3.75     12.4%    26.3%    23.6%    31.9%
9/30/74       63.54     3.99     31.2%    32.0%    61.7%    65.6%
11/30/87     230.30     4.06     13.8%    18.8%    39.2%    50.2%
4/30/90      330.80     4.08     -8.1%    13.5%    18.6%    25.4%
11/30/90     322.22     4.36     21.0%    16.4%    28.9%    33.9%
9/28/90      306.05     4.60     22.6%    26.7%    31.9%    36.5%
10/31/90     304.00     4.81     23.5%    29.1%    36.5%    37.7%

Average Return                      8.3%    14.1%    19.4%    23.0%
Number of Occurrences                 53       53       53       53
Number of Positive Occurrences        43       47       52       53
Positive Occurrences as % of Total   81%      89%      98%     100%
Maximum Return                     31.2%    39.1%    61.7%    65.6%
Minimum Return                    -12.7%    -9.4%    -3.3%     0.6%

The figures below give the average of all data during the study period:

Average Return                      4.1%     8.1%    12.2%    16.6%
Number of Occurrences                585      579      573      567
Number of Positive Occurrences       386      403      419      448
Positive Occurrences as % of Total   66%      70%      73%      79%

Appendix B

Trade signals using the extremes in the RAPAD mutual fund cash level as entries and exits.

Biography

Jason Goepfert is the President and CEO of Sundial Capital Research, Inc., a firm focused on the research and practical application of mass psychology to the financial markets. Prior to founding Sundial, Jason managed the operations of a large discount brokerage firm and a multi-billion dollar hedge fund, experience which firmly planted the idea that logic rarely trumped emotion when it came to traders’ investment decisions. Sundial trades proprietary capital and releases its research to institutional clients and individual investors via its web site, www.sentimenTrader.com.


This paper introduces a new volume-price measurement tool that could provide the clearest picture of the volume-price relationship of any indicator devised: the Volume-Price Confirmation Indicator, or VPCI. The VPCI reveals the intrinsic relationships between price and volume as a validation or contradiction of price trends. In other words, VPCI identifies the inherent relationship between price and volume as harmonious or inharmonious. This study shows that investors who use the VPCI properly may increase their profits and the reliability of their trades, while simultaneously reducing risk.

In the exchange markets, price results from an agreement between buyers and sellers to exchange, despite their different appraisals of the exchanged item’s value. One opinion may be heavily loaded in meaning and purpose; the other may be pure nonsense. However, both are equal as far as the market is concerned. Price represents the convictions, emotions and volition of investors. It is not a constant, but rather is changed and influenced by information, opinions and emotions over time. Market volume represents the number of shares traded over a given period. It is a measurement of participation, enthusiasm or interest in a given security. Price and volume are closely linked, yet are independent variables. Together, these individually derived variables give better indications of supply and demand than either can provide independently.

Volume can be thought of as the force that drives the market. Force or volume is defined as power made operative against support or resistance.i In physics, force is a vector quantity that tends to produce an acceleration.ii The same is true of market volume. Volume substantiates and mediates price. When volume increases, it confirms price; when volume decreases, it contradicts price movements. In theory, increases in volume should precede significant price movements, giving quicker downside and upside signals. This basic tenet of technical analysis has been repeated as a mantra since the days of Charles Dow.iii

When stocks change hands, there is always an equal amount of buy volume and sell volume on executed orders. When the price moves up, it reflects reasoned demand, or the fact that buyers are in control. Likewise, when the price moves down, it infers supply, or that sellers are in control. Over time, these trends of supply and demand form accumulation and distribution patterns. VPCI was designed to expose price and volume relationships as validation or contradiction of price trends. The following pages discuss the derivation and components of VPCI, explain how to use VPCI, review comprehensive testing of VPCI and present further applications.

Deriving the Components

The market is likened to an orchestra without a conductor. By mediating the intrinsic relationship between price and volume, the VPCI attunes price and volume into an observable accord. Simply put, this could be considered the harmony between price and volume. The basic concept is that measuring the difference between volume-weighted moving averages (VWMAs) and the corresponding simple moving average (SMA) reveals a precise level of price-volume confirmation or price-volume contradiction. This occurs because volume-weighted averages weight closing prices in exact proportion to the volume traded during each time period.

Since VWMAs are essential to understanding the VPCI, it is important to differentiate them from SMAs. The VWMA was developed to give a more accurate account of trends by modifying the SMA. The VWMA measures the commitment expressed through a closing price, weighted by that day’s corresponding volume (participation), compared to the total volume (participation) of the trading range. Although SMAs exhibit a stock’s changing price levels, they do not reflect the amount of participation by investors. However, with VWMAs, price emphasis is directly proportional to each day’s volume, compared to the average volume in the range of study.

The VWMA is calculated by weighting each timeframe’s closing price with the timeframe’s volume compared to the total volume during the range:

volume-weighted average = sum {closing price (I) * [volume (I) / total volume of range]}

where I = a given day’s action.

Here is an example of how to calculate a two-day moving average, using both SMA and VWMA, on a security trading at $10.00 with 100,000 shares on the first day and at $12.00 with 300,000 shares on the second day. The SMA calculation is Day One’s price plus Day Two’s price divided by the number of days, or (10+12)/2, which equals 11. The VWMA calculation is Day One’s price (10) multiplied by Day One’s volume expressed as a fraction of the total range (100,000/400,000 = 1/4), plus Day Two’s price (12) multiplied by Day Two’s volume expressed as a fraction of the total range (300,000/400,000 = 3/4), which equals 11.5.
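The two-day example can be expressed in a few lines of Python (a sketch; the function names are ours):

```python
def sma(prices):
    """Simple moving average over the full window supplied."""
    return sum(prices) / len(prices)

def vwma(prices, volumes):
    """Volume-weighted moving average: each close is weighted by its
    share of the window's total volume."""
    total = sum(volumes)
    return sum(p * v / total for p, v in zip(prices, volumes))

# The two-day example from the text: $10.00 on 100,000 shares,
# then $12.00 on 300,000 shares.
print(sma([10.0, 12.0]))                       # 11.0
print(vwma([10.0, 12.0], [100_000, 300_000]))  # 11.5
```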

Keeping in mind how VWMAs work, an investigation of VPCI may begin. The VPCI involves three simple calculations:

1.) volume-price confirmation/contradiction (VPC+/-),
2.) volume-price ratio (VPR), and
3.) volume multiplier (VM).

The first step in calculating VPCI is to choose a long-term and a short-term timeframe. The long-term timeframe number will be used in computing the VPC as the simple and volume-weighted price-moving average, and again in calculating the VM as a simple, volume-moving average. The short-term timeframe number will be used in computing the VPR as a simple and volume-weighted price-moving average, and again in calculating the VM as a simple, volume-moving average.

The VPC is calculated by subtracting a long-term SMA from the same timeframe’s VWMA. In essence, this calculation is the otherwise unseen nexus between volume proportionally weighted to price and price proportionally weighted to volume. This difference, when positive, is the VPC+ (volume-price confirmation) and, when negative, the VPC- (volume-price contradiction). In effect, this computation reveals price and volume symmetrically distributed over time. The result is quite revealing. For example, if a 50-day SMA is 48.5 whereas the 50-day VWMA is 50, the difference of 1.5 represents price-volume confirmation. If the calculation were negative, then it would represent price-volume contradiction. This alone provides purely unadorned information about the intrinsic relationship between price and volume.

The next step is to calculate the volume-price ratio. The VPR accentuates the VPC+/- relative to the short-term price-volume relationship. The VPR is calculated by dividing the short-term VWMA by the short-term SMA. For example, assume the short-term timeframe is 10 days, the 10-day VWMA is 25, and the 10-day SMA is 20. The VPR would equal 25/20, or 1.25. This factor will be multiplied by the VPC (+/-) calculated in the first step. Volume-price ratios greater than 1 increase the weight of the VPC+/-. Volume-price ratios below 1 decrease the weight of the VPC+/-.

The third and final step is to calculate the volume multiplier. The VM’s objective is to overweight the VPCI when volume is increasing and underweight the VPCI when volume is decreasing. This is done by dividing the short-term volume average by the long-term volume average. As an illustration, assume

Introducing the Volume Price Confirmation Indicator (VPCI): Price & Volume Reconciled

Buff Dormeier, CMT

Abbreviations:
SMA        simple moving average
VWMA       volume-weighted moving average
VPC (+/-)  volume-price confirmation/contradiction
VPR        volume-price ratio
VM         volume multiplier


the short-term average volume for 10 days is 1.5 million shares a day, and the long-term volume average for 50 days is 750,000 shares per day. The VM is 2 (1,500,000/750,000). This calculation is then multiplied by the VPC+/- after it has been multiplied by the VPR.

Now we have all the information necessary to calculate the VPCI. The VPC+ confirmation of +1.5 is multiplied by the VPR of 1.25, giving 1.875. Then 1.875 is multiplied by the VM of 2, giving a VPCI of 3.75. Although this number is indicative of an issue under very strong volume-price confirmation, this information serves best relative to the current and prior price trend and relative to recent VPCI levels. Discussed next is how best to use the VPCI.
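Putting the three steps together, here is a compact sketch of the full calculation. This is our own implementation of the formulas as described, not the author's code:

```python
def sma(values, n):
    """Simple moving average of the last n values."""
    return sum(values[-n:]) / n

def vwma(prices, volumes, n):
    """Volume-weighted moving average of the last n bars."""
    p, v = prices[-n:], volumes[-n:]
    total = sum(v)
    return sum(pi * vi / total for pi, vi in zip(p, v))

def vpci(prices, volumes, short=10, long=50):
    """VPCI = VPC * VPR * VM, following the three steps in the text."""
    vpc = vwma(prices, volumes, long) - sma(prices, long)    # step 1: confirmation/contradiction
    vpr = vwma(prices, volumes, short) / sma(prices, short)  # step 2: short-term ratio
    vm = sma(volumes, short) / sma(volumes, long)            # step 3: volume multiplier
    return vpc * vpr * vm
```

With the article's worked numbers (VPC +1.5, VPR 1.25, VM 2) the product is 1.5 × 1.25 × 2 = 3.75, matching the text. On flat price and volume series the VPCI is zero, as expected.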

Using VPCI

Confirming Signals

Unlike other volume-price indicators, the VPCI is not a stand-alone tool. Most volume-price indicators may give signals without regard to price trend (although this is not advised). For example, a trader may buy an issue based on a breakout of On Balance Volume, or sell an issue on a Money Flow Index divergence in an overbought zone. However, the VPCI gives virtually no indications outside of its relationship to price; it only confirms or contradicts the price trend. There are several ways to use the VPCI in conjunction with price trends and price indicators. These include a VPCI greater than zero, a rising or falling VPCI, a smoothed (moving average) rising or falling VPCI, or a VPCI used as a multiplier. Table 1 gives the basic VPCI utilizations:

Table 1. VPCI and Price Trends

Price       VPCI        Price-Trend Relationship    Implications
Rising      Rising      Confirmation                Bullish
Rising      Declining   Contradiction               Bearish
Declining   Rising      Confirmation                Bearish
Declining   Declining   Contradiction               Bullish

VPCI in Action

In our first example (Figure 1), the price trend of SIRI is rising and the VPCI is also rising. Here the VPCI is giving three bullish signals, the most important being that the VPCI is rising. Increasing volume and price confirmation demonstrate strengthening commitment to the existing price trend of demand. Secondly, the VPCI smoothed is rising and the VPCI has crossed above it, indicating momentum within the confirmation. This is a good indication that the existing bullish price trend will continue. Last and least important, both the VPCI and VPCI smoothed are above the zero line, indicating healthy longer-term accumulation. All of these VPCI indications are interpreted as bullish only because SIRI’s prevailing trend is rising.

Figure 1. Bullish Confirmation: SIRI’s Rising Price Trend and Rising VPCI. The Bottom Red, Wiggly Line is the VPCI; the Smoother Blue Line is the VPCI Smoothed.

Next, we look at an example of the VPCI giving a bearish contradiction signal (Figure 2). TASR’s stock price is rising, but the VPCI is falling. This situation suggests caution: a significant price correction could be looming because the intrinsic relationship between price and volume is not harmonious. Although price is rising and volume appears supportive, the VPCI is indicating that demand is no longer in control. Here two bearish signs are given in the presence of a rising stock price. Most significantly, both the VPCI and VPCI smoothed are in downtrends, indicating weakening commitment to the uptrend. Also, both the VPCI and VPCI smoothed are below zero, suggesting an unhealthy uptrend.

Figure 2. Bearish Contradiction: TASR’s Rising Price Trend and Falling VPCI

A falling stock price and a rising VPCI (Figure 3) is an example of volume-price confirmation. In our illustration, GSK’s stock price is falling and the VPCI is rising, indicating control is clearly in the hands of sellers. The VPCI moves gradually upward, supporting the downward price movement. Gaining momentum, the VPCI crosses above zero and eventually through the VPCI smoothed. GSK’s stock price breaks down shortly afterwards on the selling pressure.


Figure 3. Bearish Confirmation: GSK’s Falling Price Trend and Rising VPCI

RIMM provides an example of a bullish contradiction. In Figure 4, RIMM’s price is declining, as is the VPCI. A decreasing VPCI while the price is falling is usually a sign of increasing demand, especially if the stock has previously been in an uptrend, as was the case with RIMM. When RIMM begins to break down, the VPCI takes a sharp nosedive, indicating a weak selloff. Once the VPCI bottoms, the bulls regain control of RIMM and the breakdown is reversed. The VPCI turns upward, confirming the prior uptrend. This is a classic example of the VPCI indicating a countertrend.

Figure 4. Bullish Contradiction: RIMM’s Falling Price and a Falling VPCI

Putting it all together, let us take a look at one final example of the VPCI in action (Figure 5). It is extremely important to note when using the VPCI that volume leads or precedes price action. Unlike most indicators, the VPCI will often give indications before price trends are clear. Thus, when a VPCI signal is given in an unclear price trend, it is best to wait until one is evident. This final example is given in a weekly timeframe to illustrate VPCI signals in a longer-term cycle.

At Point 1 in Figure 5, CMX is breaking out and the VPCI confirms this breakout as it rapidly rises, crossing over the VPCI smoothed and zero. This is an example of a VPCI bullish confirmation. Later, the VPCI begins to fall during the uptrend, suggesting a pause within the new uptrend. This small movement is a bearish contradiction. At Point 2, CMX’s price falls as the VPCI continues to fall below zero and eventually through the VPCI smoothed, gaining momentum. This is a classic example of a countertrend VPCI bullish contradiction. At Point 3, the VPCI has bottomed out and, with CMX, begins to rise, confirming the last VPCI signal. Later in Point 3, the VPCI moves upward, supporting the higher price movement. By Point 4, CMX breaks through resistance, while the VPCI’s upward momentum accelerates rapidly, crossing the VPCI smoothed and zero. From this bullish confirmation, one could deduce a high probability of a price breakout, illustrating bullish confirmation once again.

Figure 5. VPCI in Action: CMX

Testing the VPCI

Applying the VPCI information to a trading system should improve profitability. To evaluate this VPCI hypothesis, it was tested via a trading system contrasting two moving average systems. The goal of this study was not to achieve optimum profitability but to compare a system using VPCI signals to that of a system not using them. The crossing of the five-day and 20-day moving averages was used to generate buy and sell signals. The five-day moving average represents the cost basis of traders in a one-week timeframe. The 20-day moving average represents the cost basis of traders in a one-month timeframe.

The shorter moving average is more responsive to current price action and trend changes, because it emphasizes more recent price changes. The longer-term moving average comprises more information and is more indicative of the longer-term trend. Because its scope is broader, the longer-term moving average normally lags behind the action of the shorter moving average. When a moving average curls upward, the investors within this timeframe are experiencing positive momentum. The opposite is true when the moving average curls downward. When the short-term moving average’s momentum is significant enough to cross over the longer-term moving average, this is an indication of a rising trend, otherwise known as a “buy signal.” Likewise, when the shorter-term moving average’s momentum crosses under the longer-term moving average, a “sell signal” is generated.

Back-tested first was a five- and 20-day crossover system. A long position is taken when the short-term moving average crosses above the long-term moving average. A short position is taken when the short-term moving average crosses under the long-term moving average. These actions tend to represent short-term changes in momentum and trend. In the comparative study, I used the same five- and 20-day crossover, but kept only the trades where the VPCI had crossed over a smoothed VPCI. This indicates a rising VPCI, or price confirmation. The VPCI settings are the same as the moving averages: 20 days for the long-term component and five days for the short-term component. The VPCI smoothed is the 10-day average of the VPCI.
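The filtering logic can be sketched as follows. This assumes the moving averages and VPCI series have already been computed and aligned, and it is a simplified reading of the rule rather than the author's actual test code:

```python
def filtered_signals(fast_ma, slow_ma, vpci, vpci_smooth):
    """Generate long/short signals on fast/slow crossovers, keeping only
    those where the VPCI is above its smoothed line (confirmation)."""
    signals = []
    for i in range(1, len(fast_ma)):
        crossed_up = fast_ma[i - 1] <= slow_ma[i - 1] and fast_ma[i] > slow_ma[i]
        crossed_dn = fast_ma[i - 1] >= slow_ma[i - 1] and fast_ma[i] < slow_ma[i]
        confirmed = vpci[i] > vpci_smooth[i]
        if crossed_up and confirmed:
            signals.append((i, "LONG"))
        elif crossed_dn and confirmed:
            signals.append((i, "SHORT"))
    return signals
```

In the paper's test, `fast_ma` and `slow_ma` would be the five- and 20-day averages, and `vpci_smooth` the 10-day average of the VPCI.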

There are a number of limitations to a study framed this way, but these settings were chosen deliberately to keep the study simple and uncompromised. First, the five- and 20-day moving average settings are too short to indicate a strong trend. This detracts from the effectiveness of the VPCI as an indicator of price trend confirmation or contradiction. However, although these settings are short, they provide more trades than a longer-term trend system, creating a more significant sample size. Also, the VPCI settings of five and 20 days are too short when the price data is only 20 days old (the length of the long-term moving average). By using these time settings, the VPCI may give indications ahead of the price trend or momentum signals given by the moving averages. However, changing the settings could be interpreted as optimization. Accordingly, a 10-day lookback delay on the VPCI and a five-day lookback delay on the VPCI smoothed were installed. This delay gives the VPCI confirmation signal more synchronicity with the lagging moving average crossover. Ideally, VPCI delays should be “tuned in” to the individual issue. My testing has shown that more responsive high-volume and high-volatility issues generally do not require delays as long as those of slower-moving low-volume and low-volatility issues. One could also use trend lines corresponding to the timeframe being applied to tune the VPCI.

To ensure a broad scope within the sample being studied, the test was broken into several elements. Securities were selected across three areas of capitalization: small, as measured by the S&P Small Cap Index; medium, as measured by the S&P 400 Mid Cap Index; and large, as measured by the S&P 100 Large Cap Index. Equally important are the trading characteristics of each security. Thus, securities were further characterized by volume and volatility. Combining these seven traits forms a total of 12 groups: small cap high volume, small cap low volume, small cap high volatility, small cap low volatility, mid cap high volume, mid cap low volume, mid cap high volatility, mid cap low volatility, large cap high volume, large cap low volume, large cap high volatility and large cap low volatility (Table 2).

Table 2. Sixty Securities Organized by Size, Volume and Volatility

Large Cap        Large Cap        Large Cap    Large Cap
Low Volatility   High Volatility  High Volume  Low Volume
PG               EP               CSCO         ATI
SO               AES              MSFT         HET
BUD              DAL              INTC         BDK
WFC              ATI              ORCL         GDP
EP               NSM              GE           CPB

Mid Cap          Mid Cap          Mid Cap      Mid Cap
Low Volatility   High Volatility  High Volume  Low Volume
MDU              ESI              ATML         WPO
ATG              LTXX             SNDK         BDG
WPS              NDN              COMS         TECUA
HE               WIND             MLMN         KELYA
NFG              SEPR             CY           CRS

Small Cap        Small Cap        Small Cap    Small Cap
Low Volatility   High Volatility  High Volume  Low Volume
UNS              BRKT             CYBX         NPK
UBSI             ZIXI             MOGN         GMP
CIMA             MZ               KLIC         SXI
CTCO             LENS             HLIT         SKY
ATO              CRY              YELL         LAWS

To ensure unbiased results, five securities were back-tested in each of these 12 subgroups, for a total of 60 securities, a significant sample size. For credibility, the five securities representing each group were not selected at random, but by identifying the leaders in the various characteristics being measured. Thus, the five highest- and lowest-volume securities, as well as the five highest- and lowest-volatility securities, of each of the three capitalization groups as identified by Bloomberg (June 22, 2004) were used in the study. Any duplicated securities (high-volume and high-beta stocks were occasional duplicates) were used only once. Securities that lacked sufficient history were removed and replaced by the next-best suitable issue.

To keep the system objective, both long and short system-generated trades were taken into account. A $10,000 position was taken with each crossover.

Commissions were not included. The testing period used was August 15, 1996, to June 22, 2004, for a total of 2,000 trading days. The results were measured in terms of profitability, reliability and risk-adjusted return.
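The mechanics of the crossover system can be sketched as follows. This is a minimal illustration, assuming standard simple moving averages and signal-on-cross logic; the function names and data handling are hypothetical, not the study's actual test code.

```python
def sma_series(values, n):
    """Simple moving average series; None until n values are available."""
    return [None if i < n - 1 else sum(values[i - n + 1:i + 1]) / n
            for i in range(len(values))]

def crossover_signals(closes, fast=5, slow=20):
    """Return (bar index, 'long' or 'short') for each fast/slow SMA cross."""
    f, s = sma_series(closes, fast), sma_series(closes, slow)
    signals = []
    for i in range(1, len(closes)):
        if f[i - 1] is None or s[i - 1] is None:
            continue
        if f[i - 1] <= s[i - 1] and f[i] > s[i]:
            signals.append((i, "long"))    # bullish cross: open $10,000 long
        elif f[i - 1] >= s[i - 1] and f[i] < s[i]:
            signals.append((i, "short"))   # bearish cross: open $10,000 short
    return signals
```

A flat series that begins rising generates a single bullish signal on the bar where the five-day average first overtakes the 20-day average.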

Profitability

Profitability was tested using a five- and 20-day moving average crossover and then retested using only those trades also displaying VPCI confirmation signals. The results were impressive (Figure 6). Broadly, the VPCI improved profitability in the three size classes (small, mid, and large caps) and all four style classifications (high and low volume, and high and low volatility). Nine of the 12 subgroups showed improvement. The exceptions were mid cap high-volatility issues, and small and large cap low-volume issues. Of the 60 issues tested, 39, or 65%, showed improved results using the VPCI. The VPCI group made $381,089, compared with only $169,092 for the competing non-VPCI group. Thus, overall profitability was boosted by $211,997 with the VPCI.
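The confirmation filter rests on the VPCI calculation defined earlier in the article: the volume price confirmation/contradiction (VPC, the long-term volume-weighted moving average minus the long-term simple moving average), the volume price ratio (VPR) and the volume multiplier (VM), multiplied together. A minimal sketch, with illustrative short and long periods and hypothetical function names:

```python
def sma(vals, n):
    """Simple moving average of the last n values."""
    return sum(vals[-n:]) / n

def vwma(closes, volumes, n):
    """Volume-weighted moving average of the last n bars."""
    c, v = closes[-n:], volumes[-n:]
    return sum(ci * vi for ci, vi in zip(c, v)) / sum(v)

def vpci(closes, volumes, short=5, long_=20):
    """VPCI = VPC * VPR * VM, per the definition given earlier in the article."""
    vpc = vwma(closes, volumes, long_) - sma(closes, long_)   # confirmation/contradiction
    vpr = vwma(closes, volumes, short) / sma(closes, short)   # volume price ratio
    vm = sma(volumes, short) / sma(volumes, long_)            # volume multiplier
    return vpc * vpr * vm
```

When volume expands with a rising trend the VPCI is positive (confirmation); when volume contracts as price rises it turns negative (contradiction), which is the condition used to screen out unconfirmed crossover trades.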

Figure 6. Profitability Improvement with VPCI

Reliability

Reliability was measured by looking at the percentage of profitable trades. By employing the VPCI in the five-/20-day crossover system, the percentage of profitable trades improved an average of 3.21% per issue. Improvement was realized by adding the VPCI in all three size groups and all four style groups (Figure 7). Of the 12 subgroups, 10 showed improved reliability with the VPCI. Large and small cap low-volatility issues were the two exceptions. Overall, over 71% (43 of 60 issues) showed improvement with the VPCI.
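The reliability metric itself reduces to a one-line calculation (the trade P&L values below are illustrative, not from the study):

```python
def reliability(trade_pnls):
    """Percentage of trades with positive P&L, i.e. profitable trades."""
    wins = sum(1 for p in trade_pnls if p > 0)
    return 100.0 * wins / len(trade_pnls)
```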

Figure 7. Average Trading System Reliability

Risk-Adjusted Returns

Two tests of risk-adjusted performance were conducted to further evaluate the VPCI. One was the Sharpe Ratio, which subtracts the risk-free rate of return (US Treasury note) from the total return and divides the result by the portfolio's monthly standard deviation, giving a risk-adjusted rate of return. The VPCI improved the results once again across all three size categories and all four style groups. The VPCI realized improvement in nine of the 12 subgroups; mid cap high volatility, large cap low volatility, and large cap low volume were the exceptions. Overall, the Sharpe Ratio showed significant improvement with the addition of the VPCI.
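As described, the ratio divides excess return by the standard deviation of monthly returns. A sketch with illustrative numbers (the function name and inputs are hypothetical):

```python
from statistics import pstdev  # population standard deviation

def sharpe_ratio(monthly_returns, risk_free_monthly):
    """Mean return in excess of the risk-free rate, divided by the
    standard deviation of monthly returns."""
    excess = [r - risk_free_monthly for r in monthly_returns]
    return (sum(excess) / len(excess)) / pstdev(monthly_returns)
```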

A second way to look at risk-adjusted returns is through profit factor (Figure 8). Profit factor takes into account how much money could be gained for every dollar lost within the same strategy, measuring risk by comparing the upside to the downside. It is calculated by dividing gross profits by gross losses.


For instance, one issue may generate $40,000 in losses and $50,000 in gains, whereas a second issue may generate $10,000 in losses and $20,000 in gains. Both issues generate a $10,000 net profit. However, an investor could expect to make $1.25 for every dollar lost in the first system, while expecting to make $2 for every dollar lost in the second system. The figures of $1.25 and $2 represent the profit factor. Even more significant improvements across all size, volume and volatility groups were again achieved using the VPCI. Of the 12 subgroups, only large cap low-volatility issues did not show an improvement with the VPCI. Overall, the profit factor was improved by 19%, meaning one could expect to earn 19% more profit for every dollar lost when applying the VPCI to the trading system.
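The two worked examples reduce to a single division:

```python
def profit_factor(gross_profits, gross_losses):
    """Dollars gained per dollar lost: gross profits / gross losses."""
    return gross_profits / gross_losses
```

Applied to the text's examples, $50,000 in gains against $40,000 in losses yields 1.25, and $20,000 against $10,000 yields 2.0, despite both netting the same $10,000 profit.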

Figure 8. Profit Factor Improvement Using VPCI Among the 12 Subgroups

Other Applications

The raw VPCI calculation may be used as a multiplier or divider in conjunction with other indicators such as moving averages, momentum indicators, or price and volume data. For example, if an investor has a trailing stop loss order set at the five-week moving average of the lows, one could divide the stop price by the VPCI calculation. This would lower the stop price when price and volume are in confirmation, increasing the probability of keeping an issue that demonstrates price-volume confirmation. However, when price and volume are in contradiction, dividing the stop loss by the VPCI would raise the stop price, preserving capital. Similarly, using the VPCI as an add-on to various other price, volume, and momentum indicators may not only improve reliability but increase responsiveness as well.
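One plausible reading of the stop-loss adjustment is sketched below. The exact scaling of the divisor is not specified in this excerpt; the sketch assumes a VPCI-derived factor that hovers near 1.0, above 1 under confirmation and below 1 under contradiction, and the function name is hypothetical.

```python
def vpci_adjusted_stop(ma_of_lows, vpci_factor):
    """Divide the moving-average stop by a VPCI-derived factor:
    confirmation (factor > 1) lowers the stop, giving the trade room;
    contradiction (factor < 1) raises it, preserving capital."""
    return ma_of_lows / vpci_factor
```

For example, with a $100 moving-average stop, a confirming factor of 1.25 loosens the stop to $80, while a contradicting factor of 0.8 tightens it to $125, forcing an earlier exit.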

Conclusion

The VPCI reconciles volume and price as determined by each of their proportional weights. This information can be used to confirm or deny the likelihood of a current price trend continuing. This study clearly demonstrates that adding the VPCI indicator to a trend-following system results in consistently improved performance across all major areas measured by the study. Like a maestro's baton in the hands of a proficient investor, the Volume Price Confirmation Indicator is a tool capable of substantially accelerating profits, reducing risk and empowering the investor to make more reliable investment decisions.

Footnotes

i. Ammer, C. (1997). The American Heritage Dictionary of Idioms. Boston: Houghton Mifflin Company.

ii. The American Heritage Stedman's Medical Dictionary. (2002). Boston: Houghton Mifflin Company.

iii. Edwards, R.D., & Magee, J. (1992). Technical Analysis of Stock Trends. Boston: John Magee Inc.

BIOGRAPHY

Buff Dormeier, CMT began in the securities industry in 1993 with PaineWebber. From PaineWebber, Buff joined Charles Schwab, where he handled some of the firm's largest and most active accounts. Due to his growing popularity with the firm's clientele, Buff was called to train other brokers in the art of communicating technical market analytics to customers. His training program gave birth to Schwab's Technical Analysis Team. Later, Buff became the lead portfolio manager and chief technical analyst at T.P. Donovan Investments. Armed with proprietary indicators and investment programs, Buff now coaches and manages portfolios for individual and institutional clients as a financial advisor at a major international brokerage firm.

Buff's work with market indicators and trading system design has been both published and referenced in Stocks & Commodities and Active Trader magazines, and has been discussed in seminars across the nation. Further, Buff's original contributions to the field have been included in John Bollinger's book, "Bollinger on Bollinger Bands." Buff is Series 8, 7, 65 and 63 registered and insurance licensed, and has previously served on the Market Technicians Association Admissions Committee. Personal hobbies include running and Bible study.
