A View from 20 Years as a Historian of Computing

Paul E. Ceruzzi
Smithsonian Institution

The author's reflections go back to the late 1970s, when the electronic stored-program computer was itself only a few decades old and the challenge in the history of computing field was to convey computer history's significance and reach. The contemporary challenge is to maintain historical standards, objectivity, and distance while keeping abreast of technology's rapid changes. To survive, the field must also defend the position that the study of computing is separate from the computing that permeates modern life via the Web.

In an interview recently published in the Annals, Bernie Galler remarked that, when helping establish this journal, he fought for a rule that its editors would not accept papers on topics more recent than 15 years old.[1] In practical terms, this rule meant that when the Annals began publishing in July 1979, early issues would cover events up to and including IBM's System/360 series, announced in 1964 and first installed in 1965. I will say more on this restriction later, but for now I want to note that it was coupled with an assumption about where the field began, which resulted in the journal's focusing on events that occurred in a narrow range of years. The founders and editorial board understood that computing was an activity whose origins lay in antiquity and coincided with the emergence of mathematics. But they believed that this activity's history became important enough to warrant its study only in the late 1940s. That decade saw the completion of large, automatic machines like the Harvard Mark I, the ENIAC (Electronic Numerical Integrator And Computer), the IBM SSEC (Selective Sequence Electronic Calculator), and above all the Cambridge EDSAC (Electronic Delay Storage Automatic Calculator), developed under Maurice Wilkes's direction and in daily operation by 1949. The assumption behind the journal's title was that computing meant more than mechanical or slide rule calculation or punched card tabulation, however important these were. Computing also meant automatic operation, which machines prior to 1940 either lacked or had only to a rudimentary degree, but which the EDSAC had in full measure.[2]

Because I entered the field just as the Annals was getting under way, I recall those arguments so well that I often forget that we need to articulate them to those who entered the field later. Although I had nothing to do with the founding of the journal or with its explicit or implied focus, I shared those assumptions. For me, the field was established by two books that appeared in the early 1970s: Brian Randell's The Origins of Digital Computers: Selected Papers (Springer-Verlag, 1975) and Herman H. Goldstine's The Computer from Pascal to von Neumann (Princeton University Press, 1972). Both were written by practitioners in the field, and both regarded the invention of the stored program as the crucial event that brought computing into existence. Randell ends his book with a brief account of the EDSAC's public demonstration in June 1949, noting:

This paper [on the EDSAC] concludes the present account of the origins of computers; however it also marks the beginning of an era, an era during which the digital computer, and its applications, have developed far beyond the aspirations of the pioneers whose work has been described in these pages.[3]

Goldstine begins his narrative well before the 20th century, but the core of his book concerns the wartime developments of the ENIAC and the EDVAC, particularly how they gave rise to the stored-program concept. As a member of the teams that worked on those machines, Goldstine was not unbiased in his allocation of credit. Subsequent books and scholarly papers, including several special issues of the Annals, have dealt at length with the question of who should get credit. But Goldstine's and Randell's emphasis on the stored program and on electronics remained valid, and I believe it was an underlying, if unstated, assumption about where the Annals would begin its focus as well.

The Annals thus established itself by focusing on what now seems a brief moment in history: from about 1946 to 1965, fewer than 20 years. Although by 2001 the journal can consider events of the mid-1980s, the Annals retains a focus on those years, when computing was characterized by mainframes, by batch or online transaction processing, by the dominance of IBM and a handful of competitors, and by development centered primarily in the US.

The beginning of computing
This emphasis, considering computing only from the 1940s on, has been criticized, primarily for its implication that what happened before 1940 was not as worthy of study. To its credit, the Annals has responded to this criticism, and the field has gained strength from those who responded with serious scholarship on earlier events. We now know a lot more about Charles Babbage and his place in history than we did when the Annals first started publishing. (The naming of the research institute devoted to the history of information processing after Babbage, which seemed questionable at the time, now seems to have been both prescient and wise.) We know about Konrad Zuse and how his work fits into the context of American developments of the 1930s and 1940s. We know about Alan Turing's work and what transpired at Bletchley Park, although not as much as we would like to know. We have a better sense of pre-World War II and wartime analog computation and how it fits into the larger picture.[4] Arthur Norberg, JoAnne Yates, and James Cortada, among others, have demonstrated how the firms that dominated computing's early decades, including IBM, Burroughs, NCR, and Remington Rand, had deep roots in punched card or mechanical data processing operations in the first half of the century. We know more about Herman Hollerith and the origins of IBM, and more about the US calculating and accounting machines industry. Similarly, a healthy number of papers have covered European developments before 1945 and through the 1960s.

This focus on the stored-program principle, and its role in defining the field, was appropriate and necessary. For those founders of the Annals who were teaching computer science and related fields, and for those who were senior figures in industry, this focus perhaps never came up, but it was crucially important for those involved in the academic study of history. The study of history has been and continues to be dominated by political and social issues. The history of technology, if taught at all, typically occupies a token corner in the history departments of all but the top US colleges and universities (note that the history of science is often aligned with the sciences). Within the history of technology, computer history might therefore be in a corner of a corner.

Elsewhere I have related how, when introducing myself to colleagues in the faculty lounge at the beginning of my academic career, someone asked, "Why the history of computing? Why not the history of washing machines?"[5] Perhaps he was ribbing me, but he had a valid point. Computers at that time (ca. 1981) had hardly entered the public's consciousness. Washing machines and other household appliances not only were much more common, they also seemed to have obvious effects on domestic life that were worthy of academic study. On a more serious note, I also recall my dissertation adviser arguing that the automobile, not the computer, was the defining technology of the 20th century, and that, if I were to argue otherwise, I had better have a good reason.

Those of us who are computer historians intuitively know that the computer is something more than the washing machine, or even the automobile, and the reason is the stored-program principle. A computer is not a single machine but one of an infinite number of machines, depending on the software written for it. Those who brought this machine into existence in the late 1940s did not predict the current use of computers as communication nodes for the World Wide Web, but they would have no trouble understanding how a stored-program computer is programmed to function as such. That notion was expressed in the writings of Herb Simon, Alan J. Perlis, and Allen Newell, who argued for the establishment of the field of computer science based on the computer's complexity, variety, and richness.[6]

Likewise, Saul Amarel, in the first edition of the Encyclopedia of Computer Science, noted that the stored-program digital computer provides

a methodologically adequate, as well as a realistic, basis for the exploration and study of a great variety of concepts, schemes, and techniques of information processing.[7]

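That universality is worth making concrete. The sketch below is my own illustration, not anything drawn from the article or from a historical machine; the instruction set, memory layout, and programs are invented purely for the example. It shows the stored-program principle in miniature: one fixed mechanism becomes a different machine whenever different contents are loaded into its memory.

    # A toy stored-program machine: instructions and data share one memory,
    # so swapping the memory contents turns the same "hardware" into a new
    # machine. (Illustrative only; opcodes and layout are invented here.)

    def run(memory):
        """Execute whatever program is stored in `memory` itself."""
        acc, pc = 0, 0                      # accumulator and program counter
        while True:
            op, arg = memory[pc]            # fetch the instruction at pc
            pc += 1
            if op == "LOAD":                # acc = memory[arg]
                acc = memory[arg]
            elif op == "ADD":               # acc += memory[arg]
                acc += memory[arg]
            elif op == "PRINT":             # emit the accumulator
                print(acc)
            elif op == "HALT":
                return

    # One memory image makes the machine a doubler...
    doubler = [("LOAD", 4), ("ADD", 4), ("PRINT", None), ("HALT", None), 21]
    run(doubler)  # prints 42

    # ...another makes it an adder. Nothing changed but the stored program.
    adder = [("LOAD", 4), ("ADD", 5), ("PRINT", None), ("HALT", None), 40, 2]
    run(adder)    # prints 42

Scale up the memory and the instruction set, and the same point covers any modern computer, Web node included.
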
Papers on the origins and history of this concept have appeared less frequently in the Annals recently, as many of the controversies surrounding its origins have been resolved (or at least aired). Likewise, its importance as a dividing point (actually, a starting point) for the true history of computing has lost some of its sharpness. But the stored-program principle remains a valid focus for computing's history.

Coverage of events since 1965
A different picture emerges in turning to events that occurred after the Annals' initial cutoff point. Instead of sliding smoothly along the 15-year window, events that occurred after 1965 have been unevenly documented and interpreted. Part of this problem is inherent in the subject material. Many assume that computing has followed the empirical rule of Moore's law, which states that, since 1959, chip density has doubled every 18 months. (See Figure 1.) It is dangerous to assume that this doubling of components per chip, which Gordon Moore first described in 1965, is equivalent to computing.[8] There are, or should be, significant aspects of computing that have nothing to do with that trend, and history suffers to the extent that historians fail to make such a distinction. But the pace of innovation in computing poses a challenge, and it cannot be ignored.

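The arithmetic behind that rule shows the scale of the challenge. A minimal sketch (only the 18-month doubling rule comes from the text; the single-component baseline in 1959 is my illustrative assumption, made just to show the shape of the curve):

    # Components per chip under the stated rule: doubling every 18 months
    # from 1959. The 1959 baseline of one component is assumed for
    # illustration; the point is the growth rate, not the absolute counts.
    def components_per_chip(year, base_year=1959, base_count=1,
                            months_per_doubling=18):
        doublings = (year - base_year) * 12 / months_per_doubling
        return base_count * 2 ** doublings

    for year in (1965, 1975, 1990, 2001):
        print(year, round(components_per_chip(year)))
    # 1965: 16; 1975: ~1,600; 1990: ~1.7 million; 2001: ~270 million

Eight orders of magnitude across the period this journal covers is a pace few other historical subjects have to contend with.
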
A countering argument would say that exponential growth has been the norm for all science and technology, and that historians of computing therefore have it neither easier nor harder than colleagues in related fields. That notion was best expressed by Derek de Solla Price in his classic Science Since Babylon (Yale University Press, 1961, revised 1975). Price looked at numerous indicators, especially the number of scientific journals, as well as the number and size of journal-abstracting services like Chemical Abstracts, and concluded that scientific knowledge was doubling about every 15 years.[9] (See Figure 2.)

This observation led to a number of conclusions, among them the familiar one that at any given moment an exceptionally high proportion of all scientists who ever lived are alive and working. Another was that at some point total saturation would exist: the number of scientists would surpass the number of human beings on the planet (world population is also growing exponentially, but not as fast). Of course, there was the conclusion that, even with the growth of abstracting services, a crisis was imminent. (Price was aware of the electronic computer's potential to forestall this crisis, but the state of computing in 1961 was rather primitive, and the computer could hardly have been expected to solve the problem. This point was reinforced to me recently in a personal communication from Douglas Engelbart, who observed that, at these doubling rates, the amount of scientific knowledge in the world will soon double every 30 days.)

[Figure 1. An early description of Moore's law, from Gordon E. Moore, "Progress in Digital Integrated Electronics," Technical Digest 1975, Int'l Electron Devices Meeting, Dec. 1975, Washington, D.C., p. 11. © 1975 IEEE.]

[Figure 2. "Exponential Growth of Scientific Knowledge," from D.J. de Solla Price, Little Science, Big Science, Columbia Univ. Press, New York, 1963, p. 10. © 1963, Columbia University Press; used with permission.]

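One way to compare the paces mentioned so far (Price's 15 years, Moore's 18 months, Engelbart's 30 days) is to convert each doubling period T, in years, into the annual growth rate it implies, 2^(1/T) - 1. A quick sketch of that conversion (the 30-day figure is Engelbart's deliberately provocative extrapolation, not a measurement):

    # Annual growth rate implied by a doubling period of T years.
    def annual_rate(doubling_period_years):
        return 2 ** (1 / doubling_period_years) - 1

    print(f"Price's literature (15 years): {annual_rate(15):.1%}")     # ~4.7%/yr
    print(f"Moore's chips (18 months):     {annual_rate(1.5):.1%}")    # ~58.7%/yr
    print(f"Engelbart's quip (30 days):    {annual_rate(30/365):.0%}") # ~460,000%/yr

Even granting Price's point that all of science grows exponentially, the exponent for computing is an order of magnitude steeper, which is one way of stating why its historians feel the squeeze sooner than their colleagues.
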
An example might put this issue in context. For many historians of technology, the seminal book that founded that field, comparable to Randell's or Goldstine's for the history of computing, was the two-volume work edited by Melvin Kranzberg and Carroll Pursell, Technology in Western Civilization (Oxford University Press, 1967). That work grew out of discussions between Kranzberg and the US Armed Forces Institute, which had asked for assistance in educating military officers about the place of technology in history. The head of the USAFI was Edward Katzenbach, the older brother of Nicholas Katzenbach, who was Attorney General at the time and who later served as IBM's chief counsel during the IBM antitrust trials of the 1970s.[10] Discussions between Katzenbach and Kranzberg led to a proposal for a book whose emphasis would be on Western technology, "with special consideration given to US technology in the 19th and 20th centuries."[10]

A couple of the project's advisory committee members objected to including anything concerning the 20th century but were overruled, in part bowing to pressure from the Armed Forces Institute, which sponsored the project. As the project's scope grew, the team decided to produce two volumes. The cutoff was to be around 1865.

By 1966, when the book reached its final form, the second volume was devoted exclusively to developments in the US, and the cutoff had moved up to 1900. In its preface, the editors state: "the breaking point is not 1648, [or] 1815 ... but the beginning of the 20th century. ... The nature of technological history itself dictated this division."[11] In an introductory essay, Carroll Pursell notes that this division had to be made because technological development progresses "not on an arithmetic scale, by accumulation of machines or the steady improvement of those already in existence," but that the process was "geometric or even logarithmic in growth." (Emphasis in the original. He meant exponential, not logarithmic.)[12]

The second volume was Technology in the Twentieth Century. Published in 1967, it had no mention of the Internet, the personal computer, or the space shuttle, among many other topics. On rereading that volume for the first time in many years, I was struck by how much I had forgotten about the technology that was invented and developed in the first two-thirds of the 20th century.

If such a project were proposed today, specifically for the history of computing rather than the history of technology, where might the cutoff point be? In informal discussions with university press editors, other museum curators, and historians working in the field, the recurring date I hear is 1990. That was when the current configuration of desktop computing stabilized: the basic workstation that is synonymous with "computer" to the public. It was also when the networking of these machines, locally through Ethernet or other LANs and globally through the Internet, became an integral, if not the defining, aspect of a computer.

By analogy with Kranzberg and Pursell, a comprehensive history of computing could be published in two volumes, with volume I covering events up to 1990 and volume II events after 1990. By the time such a work reached the publisher, the editors would have decided to move the cutoff up to about, say, 1995. Shortly after the two volumes appeared in bookstores, enough would have happened to cause readers to clamor either for a total revision or for a third volume.

Anyone who tries to keep pace falls victim to a modern version of Zeno's paradox. In the classical story, a runner, although fast, was never able to reach the finish line because he first had to traverse one-half the distance to the end, which took a finite time, and then one-half the remaining distance, which again took a smaller but still finite time, and so on.

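(A gloss of my own on the mathematics: the classical paradox actually resolves in the runner's favor, because the infinitely many sub-intervals form a convergent geometric series. For a course of length d run at constant speed v,

    \[ \sum_{n=1}^{\infty} \frac{d}{2^{n} v} \;=\; \frac{d}{v} \sum_{n=1}^{\infty} 2^{-n} \;=\; \frac{d}{v}. \]

The historian's version lacks that consolation: the finish line itself recedes, since new events accumulate during every catch-up interval, so the series of revisions need not converge at all.)
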
For the historian, in the time between typesetting a book or journal issue and the time it reaches the reader, enough has happened in computing to render the history obsolete. Many recognize this and embrace the solution of publishing electronically, thus telescoping that time down to zero. But that is a false hope, as it does nothing to compress the time spent thinking about and organizing the historical material into a coherent narrative. As I write this [April 2001], I am witnessing the collapse of the dot-com bubble. That collapse calls into serious question previous conclusions about the vitality and wealth-generating capabilities of technology firms in, for example, Silicon Valley. Studies that purport to explain the phenomenon of Silicon Valley, many in print or posted on the Web only months ago, now appear hopelessly out of date.

Two other anecdotes further illustrate the speed of change. In 1999, Internet pioneers Robert Kahn and Vint Cerf were presented with an award at the American Computer Museum in Bozeman, Montana. While touring the museum, Kahn and Cerf both commented on the high quality of the exhibits and the presentation of the artifacts, but Kahn then remarked that perhaps the museum would have to change its name to the American Internet Museum. To all of us at that ceremony, it seemed that the computer was losing its identity, becoming no more than a component of the Internet, which was the real story. The computer's invention now seems to resemble Otto's invention of the four-cycle gasoline engine. However significant that was, his invention is known because it is the power source for the automobile.

The other example concerns the exhibition of computing at the Smithsonian's National Museum of American History, which opened in 1990 as the Information Age. When it opened, it represented the state of the art in exhibits at the Smithsonian and broke new ground in its use of interactive, networked workstations for visitors. But all agree that now, 11 years later, this exhibit is out of date. Among the plans for its refurbishment is a proposal to remove large segments of computing's history, with its rich but busy display of old hardware, and replace it with exhibitry on (what else?) the Internet and the World Wide Web.

The creation myth
The above discussions bring us back to the Annals' policy of restricting content to what happened 15 or more years ago, and to the pros and cons of its soundness. Among the many computing developments that have occurred since 1965, two stand out: the invention and spread of the microprocessor-based PC, which has found its way onto the desktop and into many homes, and the development of networking. The first gave rise to firms that have dominated the industry for the past 20 years, including Intel, Microsoft, and Dell. The second development has become so interwoven with computing that much of the public views the terms computing, Web, and Internet as synonymous.

The Annals has published excellent articles about the invention of packet switching, the Ethernet, and the role of the Defense Advanced Research Projects Agency in establishing networking. Its coverage of the history of personal computing and PC-based software companies has not been as thorough. There is no shortage of popular and journalistic accounts of these topics, including television programs and Web pages. Indeed, there are far too many Web pages purporting to explain the origins of the Internet's development for an individual to assimilate. Moreover, the quality of these pages varies. The earlier history of computing also had its journalistic and popular accounts, but many fewer, also of varying quality. It is a testimony to scholars that they produced serious work, which acknowledged the contributions of journalists but which was also bolstered by a theoretical framework.

Such a framework is lacking when it comes to networking and the personal computer. For example, a well-known paper on this topic was published not in the Annals but in the Communications of the ACM, and it argues that the first PC was the Xerox Alto.[13] The editors of that journal, like many others, look at the computing milieu today and see all its antecedents in the Alto. What they fail to see is that the first wave of PCs, exemplified by the IBM PC and especially the XT and its clones, owed nothing to research done at Xerox PARC.

The elevation of Xerox PARC's role is part of what I call a creation myth for the history of modern computing. To summarize the myth risks distorting it, but essentially it is this: Today's Information Age did not result inevitably from progress in computer hardware but from the labors of a handful of people possessing extraordinary vision and drive. Starting with the computer of the 1950s (an expensive mainframe under centralized control), visionaries created a communications device that worked symbiotically with its users, amplifying and augmenting human intellect and capabilities. These far-sighted individuals began implementing that vision by developing time sharing, which later evolved into locally networked computers, and finally into a globally interconnected network of computers. In parallel, these visionaries also developed ways to make the computer more accessible to its users. They did so by moving from the punched card as the means of access to a modified teletype keyboard and printer, and later to a combination of keyboard, mouse, and cathode ray tube. In doing so, they also developed graphical methods of interaction to complement the mainframe era's text-based interfaces.

Much in this story rings true. Although the focus is often Xerox PARC and the rest of Silicon Valley, many accounts correctly give proper emphasis to earlier events that occurred at the Massachusetts Institute of Technology, in the development of SAGE and time sharing. A special place of honor is typically accorded to Project MAC at MIT, where an effort was made to redirect computing toward conversational, interactive use. One of the best contributions the Annals has made to the history of recent computing was its set of interviews with those involved with CTSS and Project MAC (vol. 14, nos. 1 and 2, 1992). An earlier special issue on SAGE (vol. 5, no. 4, 1983), as well as recent books by Agatha and Thomas Parke Hughes, Paul Edwards, and Kent Redmond and Thomas Smith, likewise provides valuable context for the transition from batch-oriented punched card machines to interactive and networked computing.[14] The most eloquent statement of the creation myth is found in a handsome booklet by Simson Garfinkel, written to commemorate the 35th anniversary of the founding of MIT's Laboratory for Computer Science.[15]

I have chosen the term creation myth deliberately; I don't know if it is true or not. Just as historians of computing once focused on the genesis of the stored-program principle, historians today should focus on this story, which offers much to explain where we are and where we might be heading. Such a focus can give us at least some path through the "trackless jungle," as Mike Mahoney calls it,[16] of current computing by offering a way to tie together disparate themes. These include the role of DARPA in computing and the relative places in history of MIT and its contemporaries on the West Coast, including Douglas Engelbart's Augmentation Research Center, Xerox PARC, the Rand Corporation, and so on. The myth brings in IBM, not as dominant as it once was but as the chief advocate of batch processing, now meeting challenges with varying success. The myth gives an appropriate place to minicomputer companies like DEC and Data General, which I always felt were slighted by historians who focused on IBM and the BUNCH (Burroughs, Univac, NCR, Control Data, and Honeywell). The myth provides a place for software: not so much the history of programming languages like Fortran or Cobol, but operating systems and languages like Unix, C, the Macintosh interface, and Windows, as well as application programs like Lotus 1-2-3.

Such a focus is not perfect, and that brings me back to the books and articles I cited. Why does the notion that the Xerox Alto was the first PC not sound quite right? One clue appears in Garfinkel's book about MIT's Lab for Computer Science. In the preface, Garfinkel mentions overhearing a conversation in a Cambridge coffee shop, in which two businessmen discussed the pioneering role of Bill Gates, Microsoft's founder and head, in inventing the Windows operating system and bringing it to market. To Garfinkel, this conversation revealed how little people know of the scientific underpinnings of modern computing, even when the research was conducted just a few miles from where they [the businessmen] were sitting.[17]

The question of just where and by whom the Windows interface was invented is one of the big questions implied by the creation myth, and Garfinkel has every right to be concerned about setting the record straight. But in doing so, he, like those who give similar emphasis to Xerox PARC, is missing something. Other than being overheard (and dismissed) in the coffee shop conversation, Bill Gates does not appear in Garfinkel's story. But Gates and Paul Allen, the founders of Microsoft, were in Cambridge in the early 1970s, when the Laboratory for Computer Science was being established. Gates was a student at Harvard; Allen was working at Honeywell a few miles away. The two young men were dedicated to bringing about a transformation of computing, just like those at Project MAC. They may have known of the work going on down the road at MIT. Perhaps they sat in the same coffee shop where Garfinkel later overheard the two businessmen. Gates and Allen surveyed the state of computing in 1975 and decided to leave Cambridge and go where computing's future was being created: Albuquerque, New Mexico, where the Altair, a PC based on an Intel processor, was being assembled and sold. The personal computer was not part of Project MAC's vision, but it was the center of Gates's and Allen's vision, and of their company, Micro-soft [sic].

Where does this story fit? We cannot ignore Microsoft or Intel in relating the history of modern computing, but we might have to admit that neither they, nor companies like Dell and Compaq, owe much to Xerox PARC, or to MIT for that matter (at least not directly). The desire to write history from the present backwards, coupled with Microsoft's and Intel's powerful public relations efforts, plus the withdrawal of Ed Roberts (the Altair's inventor) from this debate, has led many to conclude that Ed Roberts, like John Mauchly before him, did nothing. Such a conclusion would imply that Intel would have become the dominant architect of computing anyway, and that the same would be true of Microsoft's domination of software. Doug Engelbart, who was at the forefront of building networked, interactive computing, once told of how he regarded the PC phenomenon with horror, as it went against everything he was trying to do with computers.[18] Historians would be well advised to consider the difficulties in networking early PCs based on the Intel 8088 processor, and how some advocates regarded networking as an evil: a reminder of the mainframe attitudes that they were trying to break away from.

Take another look
How important all this was is unknown, but we need to find out. Today's scholars of computing's history could construct a narrative reconciling these events, just as the first generation of historians reconciled the convergent stories behind the stored-program, electronic digital computer. Developing this story might damage the place in history of MIT, time sharing, ARPA, and Xerox, or it might not. But the PC's invention must be addressed. In spite of all that has been written about Gates, Microsoft, and the invention of the PC, historians have not yet managed to construct a coherent narrative that reconciles most of what has happened in the past 20 years. With the structure I have outlined above, we can achieve that. Doing so may not be enough to ensure that the Annals, or the study of the history of computing as a separate discipline, will last another 20 years. Perhaps the digital computer will take its place alongside the four-cycle gasoline engine, or the washing machine. Or perhaps historians will abandon a focus on hardware (and associated programming languages and systems) and focus their scholarship on the culture of the World Wide Web and cyberspace. If computing becomes a thread in the fabric of daily life, and therefore invisible, as current researchers at the MIT Media Lab claim will happen, it will not be easy to maintain a journal devoted to its history. Regardless of how computing evolves, much remains to explain how computing developed to its present stage, never mind to a hypothetical point of invisibility in the future. The question of whether the subject was worthy of study was answered affirmatively long ago. The question of whether it will remain so should also be answered in the affirmative, as long as the practitioners of its history provide a framework on which to tell this story.

References and notes
1. B. Galler, "A Career Interview with Bernie Galler," IEEE Annals of the History of Computing, vol. 23, no. 1, Jan.-Mar. 2001, p. 31.
2. P.E. Ceruzzi, Reckoners: The Prehistory of the Digital Computer, from Relays to the Stored Program Concept, 1935-1945, Greenwood Press, Westport, Conn., 1983.
3. B. Randell, The Origins of Digital Computers: Selected Papers, 2nd ed., Springer-Verlag, Berlin, 1975, p. 353.
4. J.A.N. Lee, ed., "Special Section: Analog Computers," IEEE Annals of the History of Computing, vol. 15, no. 2, Apr.-June 1993, pp. 8-52.
5. P.E. Ceruzzi, A History of Modern Computing, MIT Press, Cambridge, Mass., 1998.
6. A. Newell, A.J. Perlis, and H.A. Simon, "Computer Science," Science, vol. 157, 22 Sept. 1967, pp. 1373-1374; also see H.A. Simon, The Sciences of the Artificial, MIT Press, Cambridge, Mass., 1969.
7. S. Amarel, "Computer Science," Encyclopedia of Computer Science, A. Ralston, ed., Van Nostrand, New York, 1976, p. 318.
8. G. Moore, "Progress in Digital Integrated Electronics," Technical Digest 1975, Int'l Electron Devices Meeting, IEEE Press, Piscataway, N.J., 1975, pp. 11-13.
9. D.J. de Solla Price, Science Since Babylon, 2nd ed., Yale Univ. Press, New Haven, Conn., 1975, p. 169.
10. M. Kranzberg, Kranzberg Papers, 266, 301, 1963; File: Proposal, MK to USAFI, Madison, Wis.; Nat'l Museum of American History Archives.
11. M. Kranzberg and C.W. Pursell Jr., eds., Technology in Western Civilization, vol. I, Oxford Univ. Press, New York, 1967, pp. vi-vii.
12. M. Kranzberg and C.W. Pursell Jr., eds., Technology in Western Civilization, vol. II, Oxford Univ. Press, New York, 1967, p. 3.
13. L. Press, "Before the Altair: The History of Personal Computing," Comm. ACM, vol. 36, no. 9, 1993, pp. 27-33.
14. T.P. Hughes, Rescuing Prometheus, Pantheon, New York, 1998; P. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America, MIT Press, Cambridge, Mass., 1996; K.C. Redmond and T.M. Smith, From Whirlwind to MITRE: The R&D Story of the SAGE Air Defense Computer, MIT Press, Cambridge, Mass., 2000.
15. S. Garfinkel, Architects of the Information Society: Thirty-Five Years of the Laboratory for Computer Science at MIT, MIT Press, Cambridge, Mass., 1999.
16. M.S. Mahoney, "The History of Computing in the History of Technology," Annals of the History of Computing, vol. 10, no. 2, 1988, pp. 113-125.
17. S. Garfinkel, Architects of the Information Society: Thirty-Five Years of the Laboratory for Computer Science at MIT, MIT Press, Cambridge, Mass., 1999.
18. D. Engelbart, personal communication, 1 May 1998.

Paul E. Ceruzzi is curator of aerospace electronics and computing at the Smithsonian Institution's National Air and Space Museum in Washington, D.C. He recently published A History of Modern Computing (MIT Press, Cambridge, Mass.). Currently, he is working on a research project to document the history of systems engineering firms located in the vicinity of Tysons Corner, Virginia.

Readers may contact Paul Ceruzzi at p.ceruzzi@computer.org.

For further information on this or any other computing topic, please visit our Digital Library at http://computer.org/publications/dlib.
