
Introduction

0. Foreword: Forward: Toward the Information Age
"The hidden harmony is stronger than the apparent."[1] This was said by Heraclitus of Ephesus, in about
500 B.C., and the notion behind it is the framework upon which all the science and natural philosophy of
human civilisations is based. The writings of Heraclitus have survived only in fragments quoted or
paraphrased in the works of others. Many of these concise comments are of that general sort that can be
found relevant to any topic, and hence are simultaneously trivialities and far-reaching truths. In
philosophy and in mathematics, the simple truths are some of the most beautiful, and what is supposedly
mundane can be shown to be wonderfully, richly structured. Heraclitus recognised this with his pithy
comments, and it is with this recognition that the present discussion must begin.

Heraclitus saw the universe as a manifestation or embodiment of a consciousness, writing that "Wisdom
is one thing: to understand the thought which steers all things through all things." The goal of science,
broadly speaking, is to reduce the corpus of observed natural phenomena to a simple set of rules capable
of describing the workings of the universe, to see this fundamental One in the apparent many. The lure
of approaching such a deeper understanding of the universe is the driving force behind "pure science",
the pursuit of science for its own sake. Stephen Hawking, a contemporary physicist, has written that to
answer "the question of why it is that we and the universe exist" is to "know the mind of God". Charles
Babbage, who in the nineteenth century anticipated in an informal manner much of the discipline now
known as computer science, wrote that to understand his clockwork machines was to understand the
mind behind the universe. The Pythagoreans of the sixth century B.C. surrounded their mathematics
with an elaborate mysticism, and gave it divine attributes. There are almost as many other examples,
preserved for posterity or not, as there have been scientists. What leads so many thinkers independently
to a notion of approaching the Divine or the Perfect by doing science?

I am at work on this thesis because, having pursued computer science to a sufficient extent, I have come
to see that computer science and mathematics are fundamentally entwined with languages and
expression. At the foundation of both lies a fascination with encodings and representations, with what
one can and cannot convey or suggest using some scheme of representation. We shall see in this work
that language simultaneously facilitates and restricts expression. In both mathematics and literature, one
confines oneself to a formalism; one builds one's own prison in the form of syntax and semantics.
However, the alternative to the common context provided by such a prison is a lack of understanding,
mere babble. Thus, language is a necessary evil. But being imprisoned need not be so horrible, at least
not when one has control over the form that one's prison is to take. If there were no prisons, there would
be no hope of escape from prisons. If there were no bondage, there would be no escape artists. If there
were no mountains, there would be no successful mountain climbers. Without the difficulties of
expression, mathematics and literature would be nonexistent. One of the most eloquent expressions of
this feeling that I have ever come across is Wordsworth's sonnet entitled "The Sonnet":

Nuns fret not at their convent's narrow room,
And hermits are contented with their cells,
And students with their pensive citadels;
Maids at the wheel, the weaver at his loom,
Sit blithe and happy; Bees that soar for bloom,
High as the highest peak of Furness fells,
Will murmur by the hour in foxglove bells:
In truth the prison unto which we doom
Ourselves no prison is: and hence for me,
In sundry moods, 'twas pastime to be bound
Within the Sonnet's scanty plot of ground;
Pleased if some souls (for such there needs must be)
Who have felt the weight of too much liberty,
Should find brief solace there, as I have found.

Given all the constraints that have arisen in the syntax and semantics of the English language, or of any
language, through the accidents of history, why confine oneself even further to the restrictions in rhyme
and metre dictated by the sonnet structure? Haven't you enough fetters already? The response that
Wordsworth shoots back is a playful, "Why not?" If you're gonna do it, do it up.

The history of science is a series of instantiations of Heraclitus's observation. Probably the best-
known example of a hidden harmony supplanting an apparent one is the victory of a heliocentric
cosmology over the more "obvious" geocentric one. To justify the placement of the earth at the centre of
the universe, all one need do is look up in the sky and see the sun, the planets, and the stars moving
around the earth upon which one is standing. It was apparent, and understandably so, to ancient humans
that the earth upon which they lived their lives was rock-solid, immovable, trustworthy. And it seems
natural that humans should occupy a special, distinguished place in the cosmos, for we know of no other
entities that are capable of contemplating the universe as we do. So putting the earth in the middle of
everything felt reasonable. The planets (from the Greek for "wanderer") were a bit out of line. As well as
rising and setting daily as the rest of the heavenly bodies did, they travelled in circuits across the sky.
Keener observation revealed the anomaly of the retrograde motion of the planets--they appeared to move
backwards along their tracks at intervals. In hindsight, we can say that this anomaly should have
occasioned a reëvaluation of geocentrism. But nobody feels the earth moving, and the immobility of the
earth was an implicit postulate, a tenet that it would have been absurd to question. It took Renaissance
observers and thinkers such as Galileo and Copernicus to set the record straight--or, more precisely, to
set it straighter than it had been previously. For heliocentrism was not the be-all and end-all of
cosmology. Johannes Kepler generalised the idea with his laws of planetary motion. And this left the
way open for Newton, who derived Kepler's laws from the more fundamental law of universal
gravitation. The simplification does not stop at this stage, but goes on to Einstein's theory of relativity
and to the aim of contemporary physicists to produce a so-called Grand Unified Theory.

Each stage of discovery in this chain represents the recognition of a more fundamental, more simple,
more beautiful harmony underlying the currently noted harmony. Science supposes that collections of
events display regularities, and that such regularities suggest rules by which the universe appears to
operate. These rules, though arrived at reasonably, can seem counterintuitive or opaque. Who would
believe that something as fundamental in human existence as the passage of time is not a universal
constant? Yet physics tells us that this is so. Who would believe that all the substances in our everyday
existence, with their wide spectrum of properties, are compositions of the same fundamental building
blocks? This is far-fetched. And physics tells us that it is true.

The physics of the nineteenth century revealed another hidden harmony, this time in the relationship
between mechanics and electromagnetism. Mechanics had long preceded electromagnetism historically;
mechanical tools have been in existence since prehistoric times, and mechanics, on earth and in the
heavens, was the object of study in Newtonian physics. Thus when the interrelationships and
dependencies between light, electricity, and magnetism were discovered, physicists naturally attempted
to explain them using mechanical models. It was implicitly assumed that scientific explanation
demanded a reduction to mechanical terms. Mechanical models for electromagnetic behaviour were
devised. Electric and magnetic forces were viewed as conditions of mechanical strain in the
electromagnetic æther, an elastic solid that was supposed to permeate space and to have no interaction
with ordinary matter except through its unique, electromagnetic properties. James Clerk Maxwell
developed a model of electromagnetism in which magnetic field lines were rotating tubes of
incompressible fluid. These tubes were separated by particles that rotated in the direction opposite to
that of the tubes and acted as gear wheels. A flow of electrical current was equivalent to a flow of these
gear wheel particles between the tubes. That such elaborate mechanical models were designed in order
to account for electromagnetism shows how engrained mechanism was in classical physics. Eventually,
the work of Maxwell and others during the second half of the nineteenth century helped to dispense with
ideas of æthers and cogs, and to explain the mechanical nature of matter in terms of its more
fundamental electromagnetic properties. As had happened in cosmology with the transposition of the
earth and the sun, a great inversion had occurred: An assumption had been questioned, and an entire
science had been turned around.

Heraclitus believed that existence depends fundamentally upon opposition. "All things come into being
through opposition, and all are in flux like a river." The composition and properties of all objects change
continuously. As I am typing this sentence at a computer keyboard, for example, the keyboard remains a
keyboard, and yet the action of my fingers upon it is constantly removing molecules from the surfaces of
the keys, and altering them subtly. "Upon those who step into the same rivers flow other and yet other
waters." So what we call a river, although macroscopically it always seems the same, is actually a
constant flux. Heraclitus probably chose the particular example of a river in order to emphasise the
fluidity of Nature. Although Heraclitus seems correct with his observation of a universal flux and
transitoriness, this does not necessarily mean that identity is an illusion. I type on the computer
keyboard, and it does change on a microscopic level, but it remains a computer keyboard. After many
years of typing, the keys may become so eroded that the keyboard ceases to be a keyboard, but that
would be an extreme case. With Heraclitus's example of the river, the waters that form the river are
never the same, yet the structure of the river persists. It is an abstraction, a higher-level form than the
water molecules that move through it. This view can be taken with many objects, whose constituents
change but whose forms, which are their essential characteristics, persist: the data stored in an electronic
computer, set up by movements of electrons; a human body, composed of simple elements arranged into
complex organic molecules, with old cells dying and new cells being created; a city, with millions of
inhabitants entering and leaving and thousands of individual buildings being constructed and razed that
together form its populace and skyline. Such abstractions of form are basic to all science, and to
computer science in particular. Particular objects can be grouped into general categories, and, given such
groupings, general methods of description and manipulation can be applied to entire groups, such as "all
right triangles", or "all monosubstituted benzene rings", or "all inertial reference frames".

Like most doctrines, Heracliteanism can be taken to unreasonable extremes. There was one Heraclitean
who claimed that true description is an impossibility, since it takes time to generate a description, and by
the time such a description has been constructed, the object that one has been trying to describe has been
changed, its original identity swept away in the flux. It is as if one were trying to communicate with a
passenger in a passing race car: as soon as the car is upon one and one begins to hail it, it has sped by,
and the moment has been lost. This Heraclitean refused to utter descriptions and was convinced that the
only possible way to identify objects was to point at them, rapidly and wordlessly. Plato said
disparagingly that the Heracliteans of his own time "take very great care to see that nothing gets
settled."[2] Indeed, when one is faced with the impermanence and apparent complexity of Nature, there
is a temptation to abandon all hope of treating it with reason and to adopt complete mysticism. I often
think of the many times I have wanted to give up when trying to sort out a hopelessly tangled problem in
computer science. Yet, somehow, perhaps after much "wasted" effort and misdirection, problems are
resolved and things are set in order. Historically, science and natural philosophy have always included
both reason and mysticism, in extremes and in combinations. Thus there has been a considerable overlap
between scientists and theologians, and, especially in this century, we find many popular works by
scientists talking about the philosophy of science and the mind of God. It seems inconsistent with the
work of Heraclitus to follow Plato in viewing him as an extremist mystic. Even while Heraclitus was
living, there was certainly no love lost between him and his fellow Greeks, and he would probably have
had just as dark a view of his devotees as Plato did. Heraclitus's book, from which his fragmentary
quotations and paraphrases have come down to us indirectly, is said to have been deposited by him in
the great temple of Artemis in Ephesus, where the general public would not have had access to it.

Heraclitus described his universal flux as "kindling by measure and going out by measure". For him, this
was in a way more than a metaphor. In tandem with the fundamental unity of the laws by which the
universe acts, Heraclitus proposed a unity of all the matter in the universe. His world-order was
composed of only one substance, fire, in various states of compression or expansion. This doctrine of
one and only one elementary building material, "material monism", foreshadowed the discoveries of
modern physics that all matter consists of elementary particles/waves. What gives macroscopic matter
its particular properties is the combinations and arrangements in which elementary building blocks
occur. Thus, information lies not in the building blocks themselves, but in their groupings and
arrangements. Just as with Heraclitus's river, the form in which a thing exists is more important than the
substance of which it is formed. Any contemporary student of chemistry learns the importance of this
idea upon encountering isomerism: ethanol and dimethyl ether, for instance, share the same formula,
C2H6O, yet differ markedly in their properties. Texts, such as works of literature, technical documents, and
computer programs, have the same property. They are all composed from the same set of symbols. It is
remarkable, and typical, that this idea of a simple, basic substance of the universe arose in an almost
complete absence of empirical evidence to support it. There was no science of chemistry. All there was
in the way of empirical evidence was the common knowledge that some substances under certain
conditions could be transformed into other substances. For example, wood in the presence of heat would
become fire and ash. Since so many substances burn and since fire was essential for everyday life, it was
natural to attribute some special role to fire. At least as important as these observations was an intuition
that the universe cannot be so complex as to allow every substance to be fundamental and irreducible.
The suggestion of a fundamentally complex universe feels uncomfortable to scientists and natural
philosophers, just as an income tax form with seven supplementary schedules, twenty pages of
instructions, and several exceptions and special cases feels more uncomfortable than a simple one-page
return with instructions on the back. The running of the universe should not be so bureaucratic.

A "fuzziness" to the universe, the transitoriness and uncertainty behind the idea of Heraclitean flux, was
recognised in the eighteenth and nineteenth centuries in the study of probability, and more recently in
computer science by many approximations and heuristics that have arisen in the study of artificial
intelligence. Uncertainty and contradiction seem inherent in the universe. This is a theme to which we
shall return, one that has great significance in the context of the mathematics of the early twentieth
century. Some scientists and philosophers abhor contradiction and try not to deal with it; others accept it,
and even delight in it. Students of Zen Buddhism, for example, try (while not trying) to attain
Enlightenment, a complete intuitive understanding and unity in the universe, by stopping thought,
stopping logical analysis. Robert Pirsig, in his book Zen and the Art of Motorcycle Maintenance,
describes the shortcoming of analysis in terms of an artificial division of subject and object induced by
its use. Even the use of language causes this division, so that to talk about something, or even to think in
language about something, is to destroy it with what Pirsig terms "the analytical knife". (The sentences
about Zen that you are reading now cannot truly be about Zen, because they are constructions of
language.) Thus, Zen is shrouded in a cloud of mysticism, and scoffed at by many westerners. Heraclitus
would have appreciated Zen, although he would have denigrated most of those who profess to be
students of Zen. Indeed, if the people of ancient Greece were exchanged with people of the
contemporary world, we would probably find many self-professed Heracliteans wrapping themselves in
the obstinate mysticisms offered by our own age, taking up "New Age" mysticism, reading books about
purported ancient astronauts, and making pilgrimages to experience "harmonic convergences". Is it any
wonder that Heraclitus regarded his fellow citizens with contempt?

Alfred North Whitehead wrote that all of modern western philosophy can be viewed as a series of
footnotes to Plato. This is not so much an accolade for Plato as it is a statement of the universality and
inevitability of much of Plato's thought. It didn't have to be Plato; he only happened to be in the right
place at the right time. The particular identity of Heraclitus is similarly irrelevant to the thrust of much
of his thinking. Like most of the work of individual scientists, his ideas were, and still are, "in the air".

Dictionary of the Khazars, by Yugoslavian author Milorad Pavic, is a novel that abandons the
traditional, linear form of narrative in favour of the structure of a lexicon. Thus the reader is not
constrained to begin with the first entry, proceed to the second, and so on, but may pick any ordering;
the sequencing of the individual entries of the dictionary is arbitrary. In the English translation of the
novel, they are alphabetised according to the English translations of their key words, thus giving them a
different ordering than they might have in the original Serbo-Croatian. References from one entry to
another may be forward or backward, depending upon the order that the reader has selected. Thus Pavic
has broken out of the prison of linear narrative. Dictionary of the Khazars contains a fascinating
character named Judah Halevi, who displays Heraclitean thinking:

"There is only one wisdom," Halevi was later to write; "the wisdom spread through the sphere of the
universe is no greater than the wisdom contained in the tiniest of animals. Except the former--composed
of pure matter, which is constant and hence diverse in kind--can be destroyed only by the Creator who
made it, whereas animals are made of matter that is subject to various kinds of influence, and so the
wisdom in them is subject to heat, cold, and everything else that affects their nature."[3]

The analytical style of this last sentence is interesting. One could imagine lifting it, with its lengthy
qualifications and glosses, from some ancient Greek philosophical treatise. Or perhaps from a more
formal version of Lucky's speech in Waiting for Godot. But leaving aside the sadly incomplete studies of
Fartov and Belcher, compare Heraclitus's thoughts, particularly his fragment that begins "Wisdom is one
thing", his view of individuals as collections of properties and their consequent evanescence, his holism.
Halevi tells of the incompleteness of analysis:

In Arabic he studied philosophy, which was under the influence of the ancient Greeks, and about which
he wrote, "It has colors but no fruits, and while feeding the mind it gives nothing to the emotions."
Hence, Halevi believed that no philosopher can ever become a prophet.[4]

The character of Halevi, and indeed the entire novel, is suffused with dualities, and with the sense that
the balances of these dualities are irrelevant. The novel exists in two editions, one male, one female. But
they differ only in one paragraph, which details only the thoughts, not any actions, of one character.
(The two versions of this paragraph are recognisably male and female, respectively--but only when
compared with each other.) Thus interpretations differ between the two editions, but not the underlying
facts upon which those interpretations are grounded. Objectively--in terms of plot, if "plot" can be taken
as a valid term in discussing such a work--the two editions are identical. Halevi's friends tell him that
"He who takes a bite in his mouth will not be able to say his name; he who says his name will make the
bite in his mouth bitter." Pavic tells of the illusory nature of names, and of nouns in general. When one
has arrived at an intuitive understanding, that understanding will necessarily be nonverbalisable. And,
correspondingly, when one uses a noun, one is creating the artifice of subject and object, destroying the
true nature. The internalising metaphor of taking a bite in one's mouth is reminiscent of the Buddhist
simile that compares the attaining of Enlightenment to the sea entering the drop of water.

To all humans, time is special, set apart. To say "I" or "myself" is to refer to oneself at some particular
time, and hence to wield the analytical knife. It seems strange to say "I", for Matthew Belmonte at this
particular moment is very different from the Matthew Belmonte of ten years ago, or even of one year
ago. If one must preserve a notion of self, it would be less painful, I think, to regard the self not as a
dynamic entity existing at a particular time in the dimensions of space, but as a static entity existing in
the dimensions of space and time, a worm-like figure snaking through four dimensions. In his novel
Slaughterhouse-Five, Kurt Vonnegut attempts to portray the world-view of a race of beings, the
Tralfamadorians, to whom time is just another dimension. To them existence is a large, four-dimensional
thing. (Not an object, a thing.) It is not perceived or experienced, because perception and experience are
processes, and process implies the passage of time. But time, according to the Tralfamadorians, does not
pass; it simply is. Slaughterhouse-Five cannot succeed in its portrayal, because it is written, and as a
piece of writing it is read by humans, and reading is a process. The passage of time and the savage
analytical slice of subject and object are built into languages. No description is possible without some
kind of motion, even if that motion is only an arbitrary one from feature to feature of some static object.
To say anything is to fail, and the only communication possible between the author and the reader is not
a conveyance of knowledge, but one of pale suggestions, triggers. This observation that true knowledge
cannot be conveyed by language was used by Socrates to corral his interlocutor by reasoning from the
interlocutor's own statements and definitions, which necessarily came packaged with the seeds of their
own negations. In the opening of Plato's Parmenides, for example, the ancient Greek philosopher Zeno
of Elea discusses how both he and his opponents were able to deconstruct each other's arguments on
cosmology by extracting absurd conclusions from premises inherent in the arguments.

I cannot help but feel, in the absence of experience, that my grounding in the western tradition of
analysis may have closed me off from the intuition about the universe that Zen practitioners are said to
feel. Things appear alternately wonderful and terrible. Already there have been occasions during the
writing of this thesis when I have felt that nothing could save me; the text on the page is a cheat; I wish
to produce the master portrait, but as soon as I touch my fingers to the keys I can see that I am crafting a
messy kindergarten sketch. And then there are times when I feel myself grooving; the ideas in my head
are in fruitful collision and I can hardly type out notes fast enough. Virginia Woolf delves deeply into
such dualities of perception in her novel To the Lighthouse, a statement of the feelings of people who are
simply living and trying to be happy. The character Lily Briscoe, a vehicle for Woolf's own sentiments
on art and creation, seesaws between grooving and depression as she tries to capture essences in her
painting:

What was the problem then? She must try to get hold of something that evaded her. It evaded her when
she thought of Mrs. Ramsay; it evaded her now when she thought of her picture. Phrases came. Visions
came. Beautiful pictures. Beautiful phrases. But what she wished to get hold of was that very jar on the
nerves, the thing itself before it has been made anything. Get that and start afresh; get that and start
afresh; she said desperately, pitching herself firmly again before her easel. It was a miserable machine,
an inefficient machine, she thought, the human apparatus for painting or for feeling; it always broke
down at the critical moment; heroically, one must force it on.[5]

To be able to create, one must accept the presence of imperfection in one's work. Beautiful isolated
phrases and intuitions come to me during the day, but when I sit down to write I feel I am killing them
by trying to hammer them into a piece of writing. I can't find the notes I'm looking for on my desk,
books are out of order in my library, it all seems a useless travesty. But it's the best I can do, so
eventually I press on. What I am striving to approach is "the thing itself before it has been made
anything", precognition, that elusive unity that exists before perception sets in and the thing is made an
object of thought.

***

Here is the attitude that you are liable to encounter, and probably already have encountered, from most
computer scientists and computer professionals in your study of computer science:

Is humankind on the verge of a leap as great as the advent of language?

The great intellectual achievements of the past hundred years have been in the physical sciences, says
John E. Hopcroft, chairman of Cornell's Department of Computer Science and 1986 winner of
computing's highest honor, the Turing Award. He points to our understanding of Newtonian physics and
quantum mechanics, of the creation of the universe, and of time, space, and relativity.

"In the next hundred years the real intellectual achievements of humankind are going to occur in two
areas: biological sciences and something that's coming to be called informational sciences," the
computer scientist predicts.

Human evolution occurred when language, art, and culture first separated our species from the others.
That was followed by the agricultural revolution and the industrial revolution.

"I believe we're right at the beginning of another--the information revolution. The changes due to
computers, whose capabilities we can barely imagine, will be as profound as the changes due to
language.

"Imagine a kind of `concept encyclopedia' that can be restructured automatically to present information
in any form you want. People will discover completely new ways of using libraries. Rather than
accessing information by key words, we'll be able to access it by concepts.

"Clearly there's a tremendous jump between what we can do with computers today and what we know
can be done," Hopcroft acknowledges. "But look how far we've come since the first computer in 1945.
That seems like a revolution. But it's just the beginning. Now every three years or so we gain an order of
magnitude in additional computing power. Twenty years from now, thirty years from now--you'll be
very surprised."[6]

These people believe that humanity is on the verge of a revolution in information accessibility, that our
ability to organise information will soon pull ahead in its race to keep up with the accelerating expansion
of the amount of published information, that someday soon we will have a Star Trek world, where the
library computer will be a much more efficient replacement for the reference librarian. They are excited
by advances in computer technology. Indeed, this belief in the dawning of the Information Age may be
correct. But even so, information accessibility will be limited by the capacity of the human brain to
acquire and to preserve knowledge. It is unlikely that this limitation of the capacity of the human brain
will be overcome technologically at any time in the near future. In February 1989, Steve Jobs, president
of the NeXT Corporation, a computer vendor, visited Cornell to introduce his NeXT computer. The
Cornell Daily Sun said of the visit:

But laughs turned into gasps as an image of... spinning molecules, drifting clouds and the NeXT logo,
grew to fill [the] screen from top to bottom, accompanied by Aaron Copeland's "Fanfare to the Common
Man." The demonstration, including the digital sound, was generated entirely by the NeXT computer.

Jobs then proceeded to detail the hardware inside the machine, which he said was taken from a wish-list
composed by university advisors around the country. "We got very lucky in that we found 24 schools,
Cornell one of them, that helped us define what the computer was," he said. "If it didn't embody what we
thought computers were going to be like in the 1990s, we'd be idiots."

The list of features on the NeXT reads like a hacker's fantasy. Its main processing board uses the state of
the art Motorola 68030 chip. The monitor is a high-resolution, 17-inch display. It comes standard with 8
megabytes of memory, transparent networking...[7]

...and so on. The author of this newspaper article is a technological Pollyanna. He believes that the
enthusiasm of his article is in keeping with the spirit of objective reporting. After all, what could be bad
about a new way to generate and to disseminate information? NeXT couldn't have done any better if it
had purchased a full-page advertisement in the Sun. And the author likely is unaware of his prejudice.

One of the objectives of this book is to temper this rampant "What, me worry?" attitude with an
alternative view. In our study of computer science, we will see that mathematics itself led in the early
twentieth century to what can be viewed as a logical refutation of positivism. Thus, science itself tells us
that we must look beyond the bounds of purely scientific thought. In order to appreciate what you will
learn in computer science, you must also cultivate an understanding of the human civilisation that
discovers and expresses that science.

History is largely ignored in today's scientific education. Young mathematicians learn logic and
analysis, but they are told nothing of the foundations of their goal in the work of thinkers from Plato and
Aristotle to Condillac. Contemporary science is a culture deracinated, and its students are for the most
part left to struggle and founder in ignorance, and perhaps haphazardly to stumble upon the roots of their
disciplines, the justifications of etymology. You, reader, should have a healthy doubt of the assertions
that I am setting forth, for I am a product of this one-dimensional educational system.

The situation is worst, in terms of this particular deficiency, in the United States. One author has
observed:

The entire education of American scientists accentuates their social deficiency. While we laud the social
sciences and insist that those who pass through our colleges should have a thorough grounding in them,
we make education so specialized that it becomes increasingly difficult for students to include in their
crowded curricula any of the social sciences which prepare for intelligent citizenship. For a scientific
student to become at all well grounded in the social sciences is quite impossible. Even in the years of
prespecialization the required courses which prepare for the specialization that is to follow are so
numerous, and the electives so few, that the preliminary so-called "general" education gives few
opportunities to the student to prepare himself for living a life of social utility.

Any pursuit of culture deters this process of worker-bee specialization and is frowned upon by modern
systems of advanced education. ...Curiously enough this almost psychopathic urge to be highly
specialized pursues the scientist into after life. He imposes specialization on himself most ritualistically;
he deprecates broad, general knowledge. Younger men are made to feel that any consideration of social
problems, or even any cultural interests, are to be frowned upon and discouraged, while a curiousity
which leads an investigator into the alluring pastures of general information is looked at askance as
diverting him from his specialty. Hundreds of scientists, perhaps thousands, actually pride themselves
upon the fact that they read absolutely nothing except the voluminous and repelling technical literature
which to-day accumulates with dire rapidity about the most narrow and unimportant specialty.[8]

This critic, T. Swann Harding, was writing in 1931, in the midst of what may be termed a revolution in
mathematics and just at the beginnings of a rapid expansion of physics and of the "pulp era" of science
fiction. This was before World War II and before Sputnik. Premature specialisation, infatuation with
technology, and the accompanying philistinism have been monotonically increasing in the intervening
years. Harding's revulsion at the piling up of arcane literature in the twentieth century may seem an
overreaction; by the beginning of that century, scientific fields had grown so broad that one had to
confine oneself to a small speciality in order to have any chance of gaining depth of knowledge about
one's object of study. Without specialisation, scientific progress would have stalled. But I sympathise
with Harding's feeling.

So, is this shearing away of science from history and philosophy a villainy? Or is it, in its full extent, a
necessary and proper action of the Information Age? I take the moderate position. It has become
necessary, but it has gone too far and is occurring too early in the educational process. There was a time,
five or six centuries ago, when it was possible for one scholar to read everything that had been
published, in every subject. The Renaissance Man could afford to be so multi-faceted because he lived
in a relative information vacuum; such people were able to create amounts of knowledge that were
significant when compared with all the rest of the common learning of humanity. That age is long gone,
and there is no going back. We despair when faced with the daunting bulk of contemporary publishing:
The library may have hundreds of titles in my area of interest; I can't possibly read all of them, or even a
majority of them. All my effort may be for nought: I may be duplicating work already published; I may
be doing original work, but my book will be one in that sea of hundreds. What's it all for?

Yet if some cataclysm were to destroy all our libraries, universities, and other centres of research and
send us back to those good old pioneering days of William Caxton and the first printing presses, we
would call it a terror. Intellectual murder. Well, which is it to be, then? Choose total knowledge or total
ignorance; either is the same. It seems we can never be satisfied. This so-called Information Age
demonstrates the inherently self-subverting nature of the road that our civilisation is taking. Heraclitus
was the earliest western philosopher known to us who enunciated the pervasiveness of opposites in
nature. The world is a place of contrasts, and we differentiate one shade of grey from another only
because one of them is the more black and the other is the more white. I can say that this particular book
is better than that particular book because I note that one of them is of higher quality than the other.
Similarly, when I judge a book by itself, I am really comparing it against many of the other books that I
have read. All judgement proceeds from comparison. Heraclitus's book survives only in quoted and
paraphrased fragments, but it must have been a masterpiece, possessing that subjective and elusive thing
that Pirsig calls "Quality" (with a capital `Q'). Think of your favourite book (or think of any good book
if you can't distinguish a favourite). And compare it to a book of precisely five hundred pages, with forty
lines per page, each of these lines comprising a row of fifty `a's, set uniformly in plain Helvetica type.
Which seems the more interesting? Let's forget for now the arbitrariness of languages. (For, of course,
there exists some language in which a string of one million `a's arranged in the manner specified has a
profound and earth-shattering meaning, but the problem of finding the definition of that language and
the problem of finding that profound, earth-shattering truth are equivalent.) Obviously (and whenever
we use this adverb we are in grave danger of muddying the analytical waters), a classic such as
Heraclitus's treatise is better than the aforementioned monotonous string of `a's. But what we mean when
we say "better than" in this context is up in the air.

Suppose I have a library containing exactly one book. Then the problem of locating any given book in
the library, or ascertaining that it is not in the collection, is trivial. One compares a given title or author
or subject to the known title or author or subject, and returns the answer. Now suppose I add a few
books to my library. I can support this addition with no change to the general method that I use for
locating books within my library. If I have, say, ten or twenty books, then I can glance at each of them
and determine what subset of them fits a given request for subject, author, or title. But suppose I have a
hundred books. Things become more tedious, more laborious. Each time I wish to retrieve a book I am
faced with searching through piles of unordered books in order to find what I want. So I decide to order
them. I have a fairly good memory of which authors have written on what subjects, and therefore if I
have an efficient way of finding titles by a particular author, I can search fairly efficiently for writings
on a given subject. So I decide to place all my books in lexicographic order of the (primary) author's
surname. I do this, and it works splendidly. I delight in the ease with which I can find what I am looking
for. Information is at my fingertips, and it seems that I have solved my problem. But I acquire more
books, and again my system breaks down, and again I need a new one. As geocentrism succumbed to
heliocentrism and mechanism to electromagnetism, simple lexicographic ordering must be supplanted
by a higher harmony. Now I have thousands of titles, and I can no longer remember everything about
who wrote about what. I need a subject classification system. So I build a card catalogue. Now I can
access any book by title, author, or subject. The canonical phrase with which I choose to represent the
subject is unavoidably arbitrary, but I will happen upon it if I look hard enough. But I am distressed. I
have spent all this effort in building up my library so that I could have more information ready at my
fingertips. But now I find that each individual volume has been made more difficult to retrieve by the
addition of the other volumes. So in a sense my system subverts itself: the more information I gather, the
more difficult it becomes to access any particular piece of information. Eventually I have added so many
more books that my simple card catalogue with its narrow subject classifications no longer suffices. I
will have to computerise my reference system, and go back through every card in my catalogue and add
more cross-references. Where does it stop? I am in a mad rat-race, my taxonomy perpetually straining to
generalise itself to keep up with the bulk of information that must fit within its scope. I have a terrible
apprehension that it can't last, it can't go on like this; something will break that I won't be able to patch.
And then the practically limitless stacks of books will be lost in a sea of entropy, useless. I will be able
to find nothing. And I will have completed the ultimate: I will have come full circle.
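
The progression I have just described--glancing at every book, then ordering the shelf, then building a
catalogue--is, in computer-science terms, the progression from linear search to binary search to an index.
Here is a minimal sketch of the three stages in Python; the handful of sample entries, and the subject
labels attached to them, are illustrative placeholders of mine, not part of any real catalogue.

    # Three stages of library lookup, mirroring the account above.
    # The entries and subject labels are invented for illustration.
    from bisect import bisect_left
    from collections import defaultdict

    books = [
        {"author": "Borges", "title": "Ficciones", "subject": "fiction"},
        {"author": "Goodrum", "title": "The Library of Congress", "subject": "libraries"},
        {"author": "Pirsig", "title": "Zen and the Art of Motorcycle Maintenance", "subject": "philosophy"},
        {"author": "Vonnegut", "title": "Slaughterhouse-Five", "subject": "fiction"},
    ]

    # Stage 1: a handful of books -- glance at each one in turn (a linear scan).
    def find_by_title(title):
        return [b for b in books if b["title"] == title]

    # Stage 2: order the shelf by the author's surname, then binary-search it.
    shelf = sorted(books, key=lambda b: b["author"])
    authors = [b["author"] for b in shelf]

    def find_by_author(author):
        i = bisect_left(authors, author)
        hits = []
        while i < len(shelf) and shelf[i]["author"] == author:
            hits.append(shelf[i])
            i += 1
        return hits

    # Stage 3: a card catalogue -- an index from subject to books, built once
    # and consulted directly for each request.
    catalogue = defaultdict(list)
    for b in books:
        catalogue[b["subject"]].append(b)

    print(find_by_title("Ficciones"))
    print(find_by_author("Pirsig"))
    print(catalogue["fiction"])

Each stage answers requests faster than the last, and each demands more upkeep: the shelf must be kept
sorted and the catalogue re-indexed every time the collection grows--exactly the self-subverting burden
described above.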

The Library of Congress is the greatest library. Ever. It currently occupies three buildings on Capitol
Hill, each the size of a city block, and several annexes, and it is growing. Popular notion has it that the
Library of Congress has one copy of every book ever published, or at least every book ever published
within the United States. This is wrong. The sheer bulk of all those books precludes retaining them all.
The Library receives 27,000 items each working day--ten million items per year. Most of these are
thrown out. Much effort is spent in deciding which of the new receipts can be disposed of and which
should be retained. The late science fiction writer Theodore Sturgeon popularised a saying known
among science fiction afficionados as Sturgeon's Rule. The folklore has it that Sturgeon's editor was
lamenting the state of science fiction and complained that ninety percent of it was "crap". Sturgeon's
reply is that this is to be expected since ninety percent of everything is crap. But how does one decide
what is crap and what is not? Or, relatively, which texts are more valuable and which are less valuable?

Computer scientists have an odd way of looking at algorithms, using a strategy that they call off-line
computation. The paradigm is that we are given a complete series of requests, and once we have the
entire series, we must come as close as we can to processing it in an optimal manner. An optimal
strategy is one that requires the least effort. Effort is a compound of the time used in running
computations and the space used in storing data. The computation in our example is the work done in
cataloguing and referencing titles, and the data are the books (and other records) themselves. The day-to-
day operation of a library is an on-line computation: the library is given a series of reference requests
from readers, but it does not know in advance precisely what requests it will be receiving. The optimal
strategy for running a library is an off-line algorithm: find out what all the reference requests will be,
and then acquire and dispose of books so that every request will be satisfied. That is, acquire a book just
before the first request for it occurs, and throw it away just after the last request for it occurs.
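
To make the off-line strategy concrete, here is a minimal sketch of it in Python. The request sequence,
the single-letter titles, and the unit of "book-timesteps" are placeholders of my own invention, not data
from any actual library: each title is held only from just before its first request until just after its
last, and the shelf space this consumes is compared with that of a policy which, having no foreknowledge,
never discards anything.

    # The off-line strategy described above: with the whole request sequence
    # known in advance, hold each book only from just before its first request
    # until just after its last. The request sequence is an invented example.
    requests = ["A", "B", "A", "C", "B", "A", "D", "C"]

    first = {}
    last = {}
    for t, book in enumerate(requests):
        first.setdefault(book, t)
        last[book] = t

    # Shelf occupancy under the off-line optimum, in book-timesteps.
    offline_cost = sum(last[b] - first[b] + 1 for b in first)

    # A library with no foreknowledge must acquire each book at its first
    # request and, not knowing whether it will be wanted again, keep it to the end.
    keep_everything_cost = sum(len(requests) - first[b] for b in first)

    print("off-line optimum:      ", offline_cost, "book-timesteps")
    print("keep-everything policy:", keep_everything_cost, "book-timesteps")

An on-line policy can only approximate this optimum by guessing, from the pattern of past requests,
which titles will be wanted again--for instance by discarding whatever has gone longest unrequested.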

Of course in the real world one has to respond to requests in real time, before one has been given the
complete series of requests, so the optimality of an off-line algorithm cannot be attained (unless we can
see into the future). But it can be approximated. Strategies for planning acquisitions in libraries are
similar to memory management schemes in computer operating systems. Both manage a limited storage
space by trying to predict future patterns of requests from the patterns of the past. Librarians ask
themselves what titles, authors, and subjects are most likely to be referenced in the future. They try to
find patterns in current reference requests and to extrapolate from those into the future. But this
approximation is highly inefficient. Books are acquired that are never referenced. We maintain much
more data than we ever use, and we are far from optimality. The problem cannot be conquered, and as
the collections grow the libraries' classification schemes become overloaded. In this Information Age
there are those who warn of the doom of information overload. We may be drowned in the ninety
percent that is redundant, insipid, or otherwise not worthwhile and never be able to find the ten percent
that is worthwhile. In their 1982 exposition The Library of Congress, Charles A. Goodrum and Helen
W. Dalrymple summarise this pessimism:

Finally, we heard from a few [employees and users of the Library of Congress] who in effect were
challenging the assumption of progress. They noted that the Library of Congress was built on the given
that more knowledge results in better solutions. More data spurs innovation. Once people know
everything about everything, there is nothing that cannot be made better for man. These commentators
worried that there is not enough proof that all the knowledge and experience stored in the Library of
Congress has made enough difference to the American society to justify the resources it has absorbed.
When pressed as to what device is likely to be better or what damage has been done by the solution thus
far tried, they were unwilling to abandon the concept or the institution but fretted that such a massive
investment of time and treasure does not seem to have had a comparable impact. People expressing this
concern offered little advice on what could be done about their distress, but the frustration was expressed
often enough that it appeared to deserve mention.[9]

(You can find Goodrum's and Dalrymple's book about the Library of Congress housed in the John
Adams building of the Library of Congress, accessible by call number Z733.U6G67 1982.) This tacit
assumption that more knowledge and more analysis are better lies at the foundation of western civilisation,
and especially of the modern western democracies. The first major acquisition of the Library of Congress
was the library of Thomas Jefferson, and it has been growing ever since. But we have already noted that
the effort to produce and to collect more and more knowledge is self-subverting, moving from an
information vacuum to information entropy.

In his short story "La Biblioteca de Babel", Jorge Luis Borges examines the problem of entropy in the
quest for knowledge. "La Biblioteca de Babel" is a history of a bibliothecal universe in the form of an
apology penned by a man at the end of his life. Its retrospective narration and tight, mathematical style
are characteristic of Borges's fiction. The universe of this story is a regular, repeating structure whose
unit cells are hexagonal chambers lined with shelving containing a precise number of books, each of
which is in a precise format, uniform in orthography and in number of pages, lines per page, and
characters per line. The universe contains all possible books--that is, every possible permutation of
characters on pages in books is actualised by some volume somewhere in the library. This is the
extremum of the current rapid growth in the amount of published information and the accompanying
entropy in libraries. When we have written out everything, in all possible languages, we will have
attained this state of complete entropy and uselessness, and all the research and writing ever done will
have become superfluous. Borges accentuates the ludicrous impression that the existence of such a
universe makes on us by projecting classical cosmologies onto it. In his narrator's speculations on the
large-scale structure of the universe one hears echoes of the systems of the world-order set forth by the
pre-Socratic philosophers. At another point, the projection of the Argument from Design onto this
implausible universe has the character of a parody:

El hombre, el imperfecto bibliotecario, puede ser obra del azar o de los demiurgos malévolos; el
universo, con su elegante dotación de anaqueles, de tomos enigmáticos, de infatigables escaleras para el
viajero y de letrinas para el bibliotecario sentado, sólo puede ser obra de un dios. Para percibir la
distancia que hay entre lo divino y lo humano, basta comparar estos rudos símbolos trémulos que mi
falible mano garabatea en la tapa de un libro, con las letras orgánicas del interior: puntuales, delicadas,
negrísimas, inimitablemente simétricas.[10]

[Man, the imperfect librarian, may be the work of chance or of malevolent demiurges; the universe, with
its elegant endowment of shelves, of enigmatic volumes, of indefatigable ladders for the traveller and of
latrines for the seated librarian, can only be the work of a god. To perceive the distance between the
divine and the human, it is enough to compare these crude, tremulous symbols which my fallible hand
scrawls on the cover of a book with the organic letters inside: punctual, delicate, perfectly black,
inimitably symmetrical.]

The implicit contrast with the historical Argument from Design is that in the universe of "La Biblioteca
de Babel" humans are considered imperfections, blemishes upon the world-order. What is considered the
marvelous work of God in this universe? The wonders of human anatomy and physiology? The eye?
The hand? No. Latrines. Borges deadpans the parody to its utmost.

The self-hatred displayed by this narrative pervades the narrator's culture, a world driven mad when
forcibly, constantly confronted with the pointlessness of creation. The narrator goes on to describe its
masochistic history--the initial surge of happiness when the philosophers proclaimed that the universe
contained all possible books--the state of ultimate fulfillment of the quest for knowledge--and the
following despair and nihilism when people realised that having everything was equivalent to having
nothing. "Hablar es incurrir en tautologías"--to speak is to fall into tautology, writes the narrator, for
anything he writes, including his current apology, is already recorded somewhere. When retrieving
information becomes more difficult than reproducing it, we may as well burn the libraries.
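
The scale of Borges's construction can be worked out from the format he gives in the story: four hundred
ten pages to a volume, forty lines to a page, some eighty characters to a line, drawn from an alphabet of
twenty-five orthographic symbols. The following back-of-the-envelope calculation in Python is mine, not
Borges's, but it uses only those four figures:

    # How many distinct volumes does the universal library contain?
    # Format as given in the story: 410 pages, 40 lines per page,
    # about 80 characters per line, 25 orthographic symbols.
    import math

    symbols = 25
    positions = 410 * 40 * 80            # 1,312,000 character positions per book

    # The number of distinct books is 25 raised to that power -- far too large
    # to print in full, so report only how many decimal digits it has.
    digits = math.floor(positions * math.log10(symbols)) + 1

    print(f"character positions per book: {positions:,}")
    print(f"distinct books: 25**{positions:,}, a number of about {digits:,} decimal digits")

Against a collection of that size no catalogue, no classification scheme, no search can help; the
narrator's despair is arithmetically well founded.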

The idea of the universal library is by no means original with Borges; his novelty lies primarily in his
artful crafting of a universe based upon the idea. The nineteenth-century German physicist, philosopher,
and author Kurd Lasswitz seems to have been one of the pioneers of the idea with his short story "Die
Universalbibliotek". Lasswitz has been called a founder of German science fiction, and this designation
seems appropriate, for "Die Universalbibliotek" is typical of science fiction in several aspects. The only
story in this story is the progress of the dialogue between the characters, a plain-vanilla gathering of a
professor, a magazine editor (we see nothing of his magazine), and the professor's family. It would more
properly be called a philosophical dialogue. The story serves as a cheap frame for the exposition and
relies fully on the novelty of that exposition to carry it. This lack of subtlety and device is often seen in
science fiction, because most science fiction writers are fascinated primarily with science and
technology. The "fiction" in "science fiction" is often incidental, a convenient capsule for a new (or a
rehash of an old) scientific idea. It is notable that "Die Universalbibliothek" was not translated into
English for quite some time--not, in fact, until its historical significance in the development of the genre
of science fiction eclipsed its very minor literary merit. Thus it could not have had much influence on
fiction written in English. Yet, relentlessly, the problem of organising and accessing the rapidly
expanding body of human knowledge became a standard theme in science fiction. The explosion of
knowledge in the twentieth century begged its discussion. The idea is capable of many other
expressions, as with the computer scientist who claims to have recorded all possible knowledge by
whiting out a blackboard with chalk, or the sculptor who claims not to have created a figure from a
block of stone, but to have freed a figure that was already present in the stone. In all these cases, the
substance--words, chalk, or stone--is already present, and the problem lies in breaking the homogeneity
of the substance, assigning meaning to it.

In June 1988, I produced a technical report entitled "A Practical Attribute Grammar Circularity Test".
The specific substance of that paper is esoteric and not relevant to this discussion. During the previous
six months I had been experimenting with some optimisations to a computer algorithm, and I had come
up with some original ways of tuning the algorithm to increase its efficiency. My advisor encouraged me
to submit it to a journal, and I did. As a naïve student submitting a paper to a prestigious journal, I was
skeptical of my chances of acceptance. My expectation was fulfilled; the paper was rejected. The tones
of the referees' reports ranged from simply damning to hesitatingly unfavourable. What had I done
wrong? There were of course matters of structure in which my naïveté had shown through. But another
major fault was my coming about three years too late. I had duplicated ideas that had been presented
earlier. In 1985, a Soviet computer scientist had published (in Russian) a paper exactly stating my
primary optimisation. This was picked up by researchers in France at l'Institut National de Recherche en
Informatique et en Automatique (INRIA), who published a description of it (in English) in one of their
technical reports in April 1988. At that time I had recently discovered the optimisation and was in the
process of implementing it and testing its effects. Meanwhile, it was already in the process of finding its
way into American computer science, from the French, who had got it from a Soviet. The Soviet
happened to come up with it first, in 1985. But if he had not, the French would have; they were on the
same track. And if they had not, perhaps a revised form of my paper would have been publishable. As it
happened, my paper was almost completely superfluous, constituting a small part of that bulk of
unnecessary publication that is leading us toward the chaos of the Universal Library. Borges wrote that
in the universal library, to speak is to fall into tautology, because somewhere in the library there is a
book containing the transcript of one's speech. The human analogue, to the pessimist, is that to live is to
risk falling into superfluity, because there may already be someone else somewhere anticipating one's
thought. Perhaps there is someone even now at work on a book like this, that weaves computer science,
literature, and philosophy together in a presentation specifically aimed at talented secondary school
students.

How could I have avoided this pitfall of duplication? The information that I discovered independently
was contained in a technical report from a prominent French research institute. If I had been working in
France I would have seen that report. But I limited myself to the references available in Cornell's library.
A university library cannot feasibly collect and preserve all technical reports from all departments of all
universities in the United States, much less from foreign countries. There are too many of them, too
much bulk. If I had pursued all the references I would have been led to order a copy of the report sent to
me from France. The amount of time this would have taken is probably less than the time it took me to
do the research myself, but if the report had not anticipated my research, the overhead processing time
taken to collect all the references would have delayed my publication. To put it simply, if one waited for
all communication to quiesce before embarking upon one's own study, one would never begin anything.
My superfluity was consummated in the fall of 1988, when a member of the research group at INRIA
visited my department at Cornell. Even if I had not already presented my results to my research group,
they could have learned them from him anyway.

But this does not mean that I should not have done my research. Immanuel Kant elaborated what is
commonly termed "the Golden Rule" into his Categorical Imperative: Act as if the maxim of your action
were to become a universal law. If the Soviet scientist, the Frenchmen, and I had all decided not to pursue
independent inquiry, the discovery would never have been made. Some amount of duplication is
inseparable from general progress. The more work there is to do, the more people there are doing it.
The more people there are, the longer it takes for information to be disseminated throughout the body of
thinkers. It is only the large amount of contemporary research that causes it to be so distributed and
partitioned by boundaries of sub-fields, nations, and languages. It is an ineluctable side-effect of the
Information Age.

This pressure towards specialisation and partitioning of knowledge brings me back to the separation of
science from the collection of disciplines now given the nebulous label "the humanities". Philosophy, as
it was begun by the Greeks, has been bisected, with natural philosophy evolving into what we now know
as the "hard" sciences, and everything else, the not-science, being lumped into "the humanities". We
have observed how each new publication increments the obscurity of all those that have come before it.
Specialisation is being pushed upon us. Even within individual fields, it is becoming impossible to have a
good grasp of everything. Henri Poincaré, in the early twentieth century, was the last mathematician to be
familiar with all the areas of mathematical study of his time. Our educational system foists specialisation
on us earlier and earlier, before we are academically mature enough to handle it. In high school,
one is categorised either as a scientist or as an artist. Exploration is discouraged, usually not by
individuals, but by the educational system. How are you going to get ahead if you waste your time
dillydallying with all those irrelevant humanities? You have little enough time as it is--look at all that's
been added to your field even within the last few decades! The system does not leave students free to
decide for themselves what is relevant and what is not.

Pirsig talks about the university structure as "the Church of Reason". And indeed, universities, perched
on their hilltops with their imitation Gothic towers and their metaphorical ivory towers, their priests of
Reason with their medieval academic vestments, seem to be striving to give themselves the airs of churches. (In
the nineteenth century, Comte tried to make the idea literal by replacing established religion with a
pseudo-religion surrounding his doctrine of positivism, reasoning from observable premises.) This
pretentious artifice has been irritating me ever since I arrived at Cornell. When I was in high school I
had had some limited experience of the university, and I believed that I saw my future within it. In
twelfth-grade English class I read Joyce's A Portrait of the Artist as a Young Man for the first time, and
was struck by a feeling of kinship with Joyce. His sense of the epiphanies of childhood was breathtaking
to me. In Stephen's fantasies of himself as a priest I recognised my ambitions for my future within the
universities. Although as a one-sided scientist I felt that I could understand Stephen's final rejection of
religion, I failed to see clearly what he had found outside of that structure. From that point on in the
story the book was just a book to me, a thing to study.

The revolt of students against universities in the 1960's was partly a Reformation of the Church of
Reason. Out with your golden chalices. Out with your dusty old classics. This revolt was the most
prominent manifestation of Reason's dissatisfaction with itself, in this case a dissatisfaction with the
arbitrary, intangible, constraining structure of the university that had persisted for centuries without
much questioning. But there are other, more subtle signals. The method of literary criticism known as
Deconstruction, for example, and the politics of its proponents within the academic structure, are a
reasonable reaction against Reason. Texts themselves are unimportant, because they all contain all
meanings, and therefore no meaning. (Remember the Universal Library.) Therefore, we rightly deny the
existence of any canonical objects of literary criticism; we banish all "classics". Same result, different
path. The Deconstructionists attack the artifice from within. (This is, to use one of their favourite
concepts in reference to texts, self-subversion.) The hippies, the naïve opponents of enforced structure,
went outside the artifice to attack it.

***

So what's the point? Why bother? We can never perfect libraries. We can never reach the optimal
algorithm. We can never exclude the possibility of duplication of effort and the consequent superfluity.
We can taste victory, tantalisingly, but we can never make it all the way. On all fronts, chaos
encroaches. What can we hope to attain? The only consolation that I am left with is the possibility of a
metaphysical-mystical existence. Art and literature, science and mathematics offer us grimy windows
onto beauty. It feels appropriate that through these glimpses we are being prepared for a final, total
approach to Beauty, the fundamental harmony. Given this possibility, the prospect of death is intriguing.
Death is the end of one's consciousness of the imperfection of human existence and expression, the
lifting of a burden. Whether or not some soul or spirit persists after death, it seems a joy to die on this
cross of imperfection that one has been compelled to carry.

Philosophers and theologians have bandied the Problem of Evil back and forth for ages: How might the
existence of God, by definition an omnipotent and perfectly good being, be reconciled with the apparent
presence of evil in the world? Why would God allow evil to exist? In general, analytical
responses to this problem work either by denying the existence of an omnipotent, perfectly good being
or by denying the existence of evil. A popular answer to this problem is the soul-building theodicy, an
elaboration of the old saw "The Lord works in mysterious ways." This is a denial of the existence of
evil--or, perhaps more properly, a denial of the evilness of what we view in this world as evil. Humans
in this world, say the proponents of soul-building, are still in a process of formation. The reason that we
must always be denied perfection and perfect happiness in this world is that we are not yet ready for it.
We cannot comprehend in what way we are not yet ready for it, but we can surmise that there is some
justification that lies beyond the scope of our current faculties. This view is remarkably similar to the
Buddhist view of life as something that happens to the spirit before it reaches a state of Enlightenment
and dispenses with its illusion of self. Evil must appear to fall randomly upon people who do not deserve
it in order that we may have no analytical, logical proof of the existence of the Divine.

The skeptic may observe that this argument cheats in a sense, by denying the applicability of analysis to
its subject. In this respect it resembles the argument of the pseudo-scientists that alien astronauts have
already visited the earth, but have recognised that our civilisation is too immature to handle contact with
them, so naturally there is no conclusive evidence of such alien visitation. Such arguments place
themselves outside the positivistic realm of traditional experimental verification. Nevertheless, one must
agree that the soul-building theodicy seems plausible from a philosophical point of view. We desperately
would like to believe it. But this desperate need to believe what seems intuitively plausible has been the
root of much confusion and error throughout history. It causes people, even reputable scientists and
philosophers, to latch onto an apparent harmony and then to fight off the hidden harmony. What is there
to be said? Logic fails to apply in this case, and the question is left forever open.

To my reader I offer one primary piece of advice that I currently believe sound: Find the roots of your
interests. Study some history and philosophy. Have at least a modicum of understanding about where all
this came from and how your work might fit into all of it. Without such a perspective, you would be
groundless.

NEXT CHAPTER: 1. Systems

[1] All fragments from Heraclitus are from [Robinson 1968], chapter 5.
[2] Plato, in [Robinson 1968], 6.37.
[3] [Pavic 1988], p. 244.
[4] [Pavic 1988], pp. 244-245.
[5] [Woolf 1927], p. 287.
[6] Cornell University summer session catalogue, 1989, p. 9.
[7] Peter Routman, "NeXT Computer Creator Demonstrates Technology", The Cornell Daily Sun, February 13, 1989, p. 1.
[8] [Harding 1931], pp. 304-305.
[9] [Goodrum & Dalrymple 1982], pp. 309-310.
[10] [Borges 1983], p. 81.
