"a view from twenty years as a historian of computing"

7
IEEE Annals of the History of Computing 1058-6180/01/$10.00 © 2001 IEEE 49 In an interview recently published in the Annals, Bernie Galler remarked that, when helping establish this journal, he fought for a rule that its editors would not accept papers on topics more recent than 15 years old. 1 In prac- tical terms, this rule meant that when the Annals began publishing in July 1979, early issues would cover events up to and including IBM’s System/360 series, announced in 1964 and for which installations began in 1965. I will say more on this restriction later, but for now I want to note that it was coupled with an assumption about where the field began, which resulted in the journal’s focusing on events that occurred in a narrow range of years. The founders and editorial board understood that computing was an activity whose origins lay in antiquity and coincided with the emergence of mathematics. But they believed that this activ- ity’s history became important enough to war- rant its study only in the late 1940s. That decade saw the completion of large, automatic machines like the Harvard Mark I, the ENIAC (Electronic Numerical Integrator And Computer), the IBM SSEC (Selective Sequence Electronic Calculator), and above all the Cambridge EDSAC (Electronic Delay Storage Automatic Calculator), developed under Maurice Wilkes’ direction and in daily opera- tion by 1949. The assumption behind the jour- nal’s title was that computing meant more than mechanical or slide rule calculation or punched card tabulation, however important these were. Computing also meant automatic operation, which machines prior to 1940 either lacked or had only to a rudimentary degree, but which the EDSAC had in full measure. 2 Because I entered the field just as the Annals was getting under way, I recall those arguments so well that I often forget that we need to artic- ulate them to those who entered the field later. Although I had nothing to do with the found- ing of the journal or with its explicit or implied focus, I shared those assumptions. For me, the field was established by two books that appeared in the early 1970s: Brian Randell’s The Origins of Digital Computers: Selected Papers (Springer Verlag, 1975), and Herman H. Goldstine’s The Computer from Pascal to von Neumann (Princeton University Press, 1972). Both were written by practitioners in the field, and both regarded the invention of the stored program as the crucial event that brought computing into existence. Randall ends his book with a brief account of the EDSAC’s pub- lic demonstration in June 1949, noting: This paper [on the EDSAC] concludes the present account of the origins of computers; however it also marks the beginning of an era, an era during which the digital computer, and its applications, have developed far beyond the aspirations of the pioneers whose work has been described in these pages. 3 Goldstine begins his narrative well before the 20th century, but the core of his narrative concerns the wartime developments of the ENIAC and the EDVAC, particularly how they gave rise to the stored-program concept. As a member of the teams that worked on those machines, his allocation of credit was not unbi- ased. Subsequent books and scholarly papers, including several special issues of the Annals, have dealt, at length, with the question of who should get credit. But Goldstine’s and Randell’s A View from 20 Years as a Historian of Computing Paul E. 
Ceruzzi Smithsonian Institution The author’s reflections go back to the late 1970s when the electronic, stored program was itself only a few decades old and the challenge, in the history of computing field, was to convey computer history’s significance and reach. The contemporary challenge is to maintain historical standards, objectivity, and distance while keeping abreast of technology’s rapid changes. To survive, the field must also defend that the study of computing is separate from the computing that permeates modern life via the Web.

Upload: mejilsa

Post on 19-Jan-2016

5 views

Category:

Documents


1 download

DESCRIPTION

Paul Ceruzzi

TRANSCRIPT

Page 1: "A View From Twenty Years as a Historian of Computing"

IEEE Annals of the History of Computing 1058-6180/01/$10.00 © 2001 IEEE 49

In an interview recently published in the Annals, Bernie Galler remarked that, when helping establish this journal, he fought for a rule that its editors would not accept papers on topics more recent than 15 years old.[1] In practical terms, this rule meant that when the Annals began publishing in July 1979, early issues would cover events up to and including IBM's System/360 series, announced in 1964 and for which installations began in 1965. I will say more on this restriction later, but for now I want to note that it was coupled with an assumption about where the field began, which resulted in the journal's focusing on events that occurred in a narrow range of years. The founders and editorial board understood that computing was an activity whose origins lay in antiquity and coincided with the emergence of mathematics. But they believed that this activity's history became important enough to warrant its study only in the late 1940s. That decade saw the completion of large, automatic machines like the Harvard Mark I, the ENIAC (Electronic Numerical Integrator And Computer), the IBM SSEC (Selective Sequence Electronic Calculator), and above all the Cambridge EDSAC (Electronic Delay Storage Automatic Calculator), developed under Maurice Wilkes' direction and in daily operation by 1949. The assumption behind the journal's title was that computing meant more than mechanical or slide rule calculation or punched card tabulation, however important these were. Computing also meant automatic operation, which machines prior to 1940 either lacked or had only to a rudimentary degree, but which the EDSAC had in full measure.[2]

Because I entered the field just as the Annals was getting under way, I recall those arguments so well that I often forget that we need to articulate them to those who entered the field later. Although I had nothing to do with the founding of the journal or with its explicit or implied focus, I shared those assumptions. For me, the field was established by two books that appeared in the early 1970s: Brian Randell's The Origins of Digital Computers: Selected Papers (Springer-Verlag, 1975), and Herman H. Goldstine's The Computer from Pascal to von Neumann (Princeton University Press, 1972). Both were written by practitioners in the field, and both regarded the invention of the stored program as the crucial event that brought computing into existence. Randell ends his book with a brief account of the EDSAC's public demonstration in June 1949, noting:

This paper [on the EDSAC] concludes the present account of the origins of computers; however it also marks the beginning of an era, an era during which the digital computer, and its applications, have developed far beyond the aspirations of the pioneers whose work has been described in these pages.[3]

Goldstine begins his narrative well before the 20th century, but the core of his narrative concerns the wartime developments of the ENIAC and the EDVAC, particularly how they gave rise to the stored-program concept. As a member of the teams that worked on those machines, Goldstine was not unbiased in his allocation of credit. Subsequent books and scholarly papers, including several special issues of the Annals, have dealt, at length, with the question of who should get credit. But Goldstine's and Randell's emphasis on the stored program and on electronics remained valid, and I believe it was an underlying, if unstated, assumption about where the Annals would begin its focus as well.

The Annals thus established itself by focusing on what now seems to be a brief moment in history: from about 1946 to 1965, fewer than 20 years. Although by 2001 the journal can now consider events of the mid-1980s, the Annals retains a focus on those years—when computing was characterized by mainframes and batch or online transaction processing, by the dominance of IBM and a handful of competitors, and by computing development primarily in the US.

The beginning of computing

This emphasis—considering computing only from the 1940s on—has been criticized, primarily for its implication that what happened before 1940 was not as worthy of study. To its credit, the Annals has responded to this criticism, and the field has gained strength from those who responded with serious scholarship on earlier events. We now know a lot more about Charles Babbage and his place in history than we did when the Annals first started publishing. (The naming of the research institute devoted to the history of information processing after Babbage, which seemed questionable at the time, now seems to have been both prescient and wise.) We know about Konrad Zuse and how his work fits into the context of American developments of the 1930s and 1940s. We know about Alan Turing's work and what transpired at Bletchley Park, although not as much as we would like to know. We have a better sense of pre-World War II and wartime analog computation and how it fits into the larger picture.[4] Arthur Norberg, JoAnne Yates, and James Cortada, among others, have demonstrated how the firms that dominated computing's early decades—including IBM, Burroughs, NCR, and Remington Rand—had deep roots in punched card or mechanical data processing operations in the first half of the century. We know more about Herman Hollerith and the origins of IBM, and more about the US calculating and accounting machines industry. Similarly, a healthy number of papers have covered European developments before 1945 and through the 1960s.

This focus on the stored-program principle, and its role in defining the field, was appropriate and necessary. For those founders of the Annals who were teaching computer science and related fields, and for those who were senior figures in industry, this focus perhaps never came up, but it was crucially important for those involved in the academic study of history. The study of history has been and continues to be dominated by political and social issues. The history of technology, if taught at all, typically occupies a token corner in history departments of all but the top US colleges and universities (note that history of science is often aligned with the sciences). Within the history of technology, computer history might therefore be in a corner of a corner.

Elsewhere I have related how, when introducing myself to colleagues in the faculty lounge at the beginning of my academic career, someone asked, "Why the history of computing—why not the history of washing machines?"[5] Perhaps he was ribbing me, but he had a valid point. Computers at that time (ca. 1981) had hardly entered the public's consciousness. Washing machines and other household appliances not only were much more common, they also seemed to have obvious effects on domestic life that were worthy of academic study. On a more serious note, I also recall my dissertation adviser arguing that the automobile, not the computer, was the defining technology of the 20th century, and, if I were to argue otherwise, I had better have a good reason.

Those of us who are computer historians intuitively know that the computer is something more than the washing machine, even the automobile, and the reason is the stored-program principle. A computer is not a single machine but one of an infinite number of machines, depending on the software written for it. Those who brought this machine into existence in the late 1940s did not predict the current use of computers as communication nodes for the World Wide Web, but they would have no trouble understanding how a stored-program computer is programmed to function as such. That notion was expressed in the writings of Herb Simon, Alan J. Perlis, and Allen Newell, who argued for the establishment of the field of "Computer Science" based on the computer's complexity, variety, and richness.[6]

Likewise, Saul Amarel, in the first edition of the Encyclopedia of Computer Science, noted that

… the stored program digital computer … provides a methodologically adequate, as well as a realistic, basis for the exploration and study of a great variety of concepts, schemes, and techniques of information processing.[7]

Papers on the origins and history of this concept have appeared less frequently in the Annals recently, as many of the controversies surrounding its origins have been resolved (or at least aired). Likewise, its importance as a dividing—actually, a starting—point for the true history of computing has lost some of its sharpness. But the stored-program principle remains a valid focus for computing's history.

Coverage of events since 1965

A different picture emerges in turning to events that occurred after the Annals' initial cutoff point. Instead of smoothly sliding along the 15-year window, events that occurred after 1965 were unevenly documented and interpreted. Part of this problem is inherent in the subject material. Many assume that computing has followed the empirical rule of Moore's law, which states that, since 1959, chip density has doubled every 18 months. (See Figure 1.) It is dangerous to assume that this doubling of components per chip, which Gordon Moore first described in 1965, is equivalent to computing.[8] There are, or should be, significant aspects of computing that have nothing to do with that trend, and history suffers to the extent that historians fail to make such a distinction. But the pace of innovation in computing poses a challenge, and it cannot be ignored.

A countering argument would say exponential growth has been the norm for all science and technology, and historians of computing therefore have it neither easier nor harder than colleagues in related fields. That notion was best expressed by Derek De Solla Price in his classic Science Since Babylon (Yale University Press, 1961, revised 1975). Price looked at numerous indicators, especially the number of scientific journals, as well as the number and size of journal-abstracting services like Chemical Abstracts, and concluded that scientific knowledge was doubling about every 15 years.[9] (See Figure 2.)

This observation led to a number of conclusions, among them the familiar one that at any given moment an exceptionally high proportion of all scientists who ever lived are alive and working. Another was that at some point total saturation would exist—the number of scientists would surpass the number of human beings on the planet (world population is also growing exponentially but not as fast). Of course, there was the conclusion that, even with the growth of abstracting services, a crisis was imminent. (Price was aware of the electronic computer's potential to forestall this crisis, but the state of computing in 1961 was rather primitive, and the computer could hardly have been expected to solve the problem. This point was reinforced to me recently in a personal communication by Douglas Engelbart, who observed that, at these doubling rates, the amount of scientific knowledge in the world will soon double every 30 days.)

[Figure 1. An early description of Moore's law, from Gordon E. Moore, "Progress in Digital Integrated Electronics," Technical Digest 1975, Int'l Electron Devices Meeting, 1–3 Dec. 1975, Washington, D.C., p. 11. (© 1975 IEEE.)]

[Figure 2. "Exponential Growth of Scientific Knowledge," from D.J. De Solla Price, Little Science, Big Science, Columbia Univ. Press, New York, 1963, p. 10. (© 1963, Columbia University Press; used with permission.)]
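The gap between these two growth rates is easy to understate. As a minimal sketch, not from the article itself, the compound growth over the journal's original 15-year embargo window can be computed directly; the doubling periods are those quoted above, and the function name and framing are illustrative assumptions:

```python
# Compare the two doubling rules quoted in the text: Moore's 18-month
# doubling of chip density and Price's 15-year doubling of scientific
# knowledge. After t years with doubling period T, growth is 2 ** (t / T).

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Multiplicative growth after `years`, given a doubling period."""
    return 2 ** (years / doubling_period_years)

# Over the Annals' original 15-year embargo window:
print(f"Chip density (18-month doubling): x{growth_factor(15, 1.5):,.0f}")
# -> x1,024 (2**10): three orders of magnitude

print(f"Scientific knowledge (15-year doubling): x{growth_factor(15, 15):.0f}")
# -> x2: a single doubling
```

On these figures, the historian's 15-year waiting period covers ten doublings of the underlying hardware but only one doubling of science at large, which is the sense in which historians of computing may have it harder than their colleagues.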

An example might put this issue in context. For many historians of technology, the seminal book that founded that field, comparable to Randell's or Goldstine's for the history of computing, was the two-volume work edited by Melvin Kranzberg and Carroll Pursell, Technology in Western Civilization (Oxford University Press, 1967). That work grew out of discussions between Kranzberg and the US Armed Forces Institute, which had asked for assistance in educating military officers about the place of technology in history. The head of the USAFI was Edward Katzenbach, the older brother of Nicholas Katzenbach, who had been Attorney General at the time and had also served as IBM's chief counsel during the IBM antitrust trials of the 1970s.[10] Discussions between Katzenbach and Kranzberg led to a proposal for a book whose emphasis would be "… on Western Technology with special consideration given to US technology in the 19th and 20th centuries."[10] A couple of the project's advisory committee members objected to including anything concerning the 20th century but were overruled, in part bowing to pressure from the Armed Forces Institute, which sponsored the project. As the project scope grew, the team decided to produce two volumes. The cutoff was to be around 1865.

By 1966, when the book reached its final form, the second volume was devoted exclusively to developments in the US, and the cutoff had moved up to 1900. In its preface, the editors state: "…the breaking point is not 1648, … [or] 1815 … but the beginning of the 20th century … The nature of technological history itself dictated this division."[11] In an introductory essay, Carroll Pursell notes that this division had to be made because "… technological development progresses not on an arithmetic scale, by accumulation of machines or the steady improvement of those already in existence, but that the process was geometric or even logarithmic in growth." (Emphasis in the original. He meant exponential, not logarithmic.)[12] The second volume was "Technology in the Twentieth Century." Published in 1967, it had no mention of the Internet, personal computer, or space shuttle, among many other topics. On rereading that volume for the first time in many years, I was struck by how much I had forgotten about the technology that was invented and developed in the first two-thirds of the 20th century.

If such a project were proposed today specifically for the history of computing, rather than the history of technology, where might the cutoff point be? In informal discussions with university press editors, other museum curators, and historians working in the field, the recurring date I hear is 1990. That was when the current configuration of desktop computing stabilized: the basic workstation that is synonymous with computer to the public. It was also when networking of these machines, locally through Ethernet or other LANs, and globally through the Internet, became an integral—if not the defining—aspect of a computer.

By analogy with Kranzberg and Pursell, a comprehensive history of computing could be published in two volumes, with volume I covering events up to 1990 and volume II events after 1990. By the time such a work reached the publisher, the editors would have decided to move up the cutoff to about, say, 1995. Shortly after the two volumes appeared in bookstores, enough would have happened to cause readers to clamor either for a total revision or a third volume.

Anyone who tries to keep pace falls victim to a modern version of Zeno's Paradox. In the classical story, a runner, although fast, was never able to reach the finish line because he first had to traverse one-half the distance to the end, which took a finite time, and then one-half the remaining distance, which again took a smaller but still finite time, and so on. For the historian, in the time between typesetting a book or journal issue and the time it reaches the reader, enough has happened in computing to render the history obsolete. Many recognize this and embrace the solution of publishing electronically, thus telescoping that time down to zero. But that is a false hope, as it does nothing to compress the time spent thinking about and organizing the historical material into a coherent narrative. As I write this [April 2001], I am witnessing the collapse of the dot-com bubble. That collapse calls into serious question previous conclusions about the vitality and wealth-generating capabilities of technology firms in, for example, Silicon Valley. Studies—many in print or posted only months ago on the Web—that purport to explain the phenomenon of Silicon Valley now appear hopelessly out of date.

Two other anecdotes further illustrate the speed of change. In 1999, Internet pioneers Robert Kahn and Vint Cerf were being presented an award at the American Computer Museum in Bozeman, Montana. While touring the museum, Kahn and Cerf both commented on the high quality of the exhibits and the presentation of the artifacts, but Kahn then remarked that perhaps the museum will have to change its name to the American Internet Museum. To all of us at that ceremony, it seemed that the computer was losing its identity, becoming no more than a component of the Internet, which was the real story. The computer's invention now seems to resemble Otto's invention of the four-cycle gasoline engine. However significant that was, his invention is known because it is the power source for the automobile.

The other example concerns the exhibition of computing at the Smithsonian's National Museum of American History, which opened in 1990 as the "Information Age." When it opened, it represented the state of the art in exhibits at the Smithsonian and broke new ground in its use of interactive, networked workstations for the visitors. But all agree that now, 11 years later, this exhibit is out of date. Among the plans for its refurbishment is a proposal to remove large segments of computing's history, with its rich but "busy" display of old hardware, and replace it with exhibitry on—what else?—the Internet and the World Wide Web.

The creation myth

The above discussions bring us back to the Annals' policy of restricting content to what happened 15 or more years ago, and the pros and cons of its soundness. Among the many computing developments that have occurred since 1965, two stand out: the invention and spread of the microprocessor-based PC, which has found its way onto the desktop and into many homes, and the development of networking. The first gave rise to firms that have dominated the industry for the past 20 years, including Intel, Microsoft, and Dell. The second development has become so interwoven with computing that much of the public views the terms computing, Web, and Internet as synonymous.

The Annals has published excellent articles about the invention of packet switching, the Ethernet, and the role of the Defense Advanced Research Projects Agency in establishing networking. Its coverage of the history of personal computing and PC-based software companies has not been as thorough. There is no shortage of popular and journalistic accounts, including television programs and Web pages, for these topics. Indeed, there are far too many Web pages purporting to explain the origins of the Internet's development for an individual to assimilate. Moreover, the quality of these pages varies. The earlier history of computing also had its journalistic and popular accounts, but many fewer, also of varying quality. It is a testimony to scholars that they produced serious work, which acknowledged the contributions of journalists but which also was bolstered by a theoretical framework.

Such a framework is lacking when it comes to networking and the personal computer. For example, a well-known paper on this topic was published not in the Annals but in the Communications of the ACM, and it argues that the first PC was the Xerox Alto.[13] The editors of that journal, like many others, look at the computing milieu today and see all its antecedents in the Alto. What they fail to see is that the first wave of PCs, exemplified by the IBM PC and especially the XT and its clones, owed nothing to research done at Xerox PARC.

The elevation of Xerox PARC's role is part of what I call a creation myth for the history of modern computing. To summarize the myth risks distorting it, but essentially it is this: Today's Information Age did not inevitably result from progress in computer hardware but from the labors of a handful of people possessing extraordinary vision and drive. Starting with the computer of the 1950s—an expensive mainframe under centralized control—visionaries created a communications device that worked symbiotically with its users, amplifying and augmenting human intellect and capabilities. These far-sighted individuals began implementing that vision by developing time sharing, which later evolved into locally networked computers, and finally to a globally interconnected network of computers. In parallel, these visionaries also developed ways to make the computer more accessible to its users. They did so by moving from the punched card as the means of access to a modified teletype keyboard and printer, and later to a combination of keyboard, mouse, and cathode ray tube. In doing so, they also developed graphical methods of interaction to complement the mainframe era's text-based interfaces.

Much in this story rings true. Although the focus is often Xerox PARC and the rest of Silicon Valley, many accounts correctly give proper emphasis to earlier events that occurred at the Massachusetts Institute of Technology, in the development of SAGE and time sharing. A special place of honor is typically accorded to Project MAC at MIT, where an effort was made to redirect computing toward conversational, interactive use. One of the best contributions the Annals has made to the history of recent computing was its set of interviews with those involved with CTSS and Project MAC (vol. 14, numbers 1 and 2, 1992). An earlier special issue on SAGE (vol. 5, number 4, 1983), as well as recent books by Agatha and Thomas Parke Hughes, Paul Edwards, and Kent Redmond and Thomas Smith, likewise provide valuable context for the transition from batch-oriented punched card machines to interactive and networked computing.[14] The most eloquent statement of the creation myth is found in a handsome booklet, by Simson Garfinkel, written to commemorate the 35th anniversary of the founding of MIT's Laboratory for Computer Science.[15]

I have chosen creation myth deliberately; I don't know if it is true or not. Just as historians of computing once focused on the genesis of the stored-program principle, historians today should focus on this story, which offers much to explain where we are and where we might be heading. Such a focus can give us at least some path through the "trackless jungle," as Mike Mahoney calls it,[16] of current computing by offering a way to tie together disparate themes. These include the role of DARPA in computing and the relative places in history of MIT and its contemporaries on the West Coast, including Douglas Engelbart's Augmentation Research Center, Xerox PARC, the Rand Corporation, and so on. The myth brings in IBM, not as dominant as it once was but as the chief advocate of batch processing, now meeting challenges with varying success. The myth gives an appropriate place to the minicomputer companies like DEC and Data General, which I always felt were slighted by historians who focused on IBM and the BUNCH (Burroughs, Univac, NCR, Control Data, and Honeywell). The myth provides a place for software, not so much the history of programming languages like Fortran or Cobol, but operating systems and languages like Unix, C, the Macintosh interface, and Windows, as well as application programs like Lotus 1-2-3.

Such a focus is not perfect, and that brings me back to the books and articles I cited. Why does the notion that the Xerox Alto was the first PC not sound quite right? One clue appears in Garfinkel's book about MIT's Lab for Computer Science. In the preface, Garfinkel mentions overhearing a conversation at a Cambridge coffee shop, in which two "businessmen" discuss the pioneering role of Bill Gates, Microsoft's founder and head, in inventing the Windows operating system and bringing it to the market. To Garfinkel, this conversation revealed how little people know of the scientific underpinnings for modern computing—even when the research was conducted "just a few miles from where they [the businessmen] were sitting."[17]

The question of just where and by whom the Windows interface was invented is one of the big questions implied by the creation myth, and Garfinkel has every right to be concerned about setting the record straight. But in doing so, he, like those who give similar emphasis to Xerox PARC, is missing something. Other than being overheard (and dismissed) in the coffee shop conversation, Bill Gates does not appear in Garfinkel's story. But Gates and Paul Allen, the founders of Microsoft, were in Cambridge in the early 1970s, when the Laboratory for Computer Science was being established. Gates was a student at Harvard; Allen was working at Honeywell a few miles away. The two young men were dedicated to bringing about a transformation of computing, just like those at Project MAC. They may have known of the work going on down the road at MIT. Perhaps they sat in the same coffee shop where Garfinkel later overheard the two businessmen. Gates and Allen surveyed the state of computing in 1975 and decided to leave Cambridge and go where computing's future was being created: Albuquerque, New Mexico, where the Altair, a PC based on an Intel processor, was being assembled and sold. The personal computer was not part of Project MAC's vision, but it was the center of Gates's and Allen's vision, and of their company, "Micro-soft" [sic].

Where does this story fit? We cannot ignore Microsoft or Intel in relating the history of modern computing, but we might have to admit that neither they, nor companies like Dell and Compaq, owe much to Xerox PARC, or MIT for that matter (at least not directly). The desire to write history from the present backwards, coupled with Microsoft's and Intel's powerful public relations efforts, plus the withdrawal of Ed Roberts (the Altair's inventor) from this debate, has led many to conclude that Ed Roberts, like John Mauchly before him, did nothing. Furthermore, this would imply that Intel would have become the dominant architect of computing anyway, and the same would be true for Microsoft's domination of software. Doug Engelbart, who was at the forefront of building networked, interactive computing, once told of how he regarded the PC phenomenon with horror, as it went against everything he was trying to do with computers.[18] Historians would be well advised to consider the difficulties in networking early PCs based on the Intel 8088 processor, and how some advocates regarded networking as an evil—a reminder of the mainframe attitudes that they were trying to break away from.

Take another look

How important this all was is unknown, but we need to find out. Today's scholars in computing's history could construct a narrative reconciling these events, just as the first generation of historians reconciled the convergent stories behind the stored-program, electronic digital computer. Developing this story might damage the place in history of MIT, time sharing, ARPA, and Xerox, or it might not. But the PC's invention must be addressed. In spite of all that has been written about Gates, Microsoft, and the invention of the PC, historians have not yet managed to make a coherent narrative that reconciles most of what has happened in the past 20 years. With the structure I have outlined above, we can achieve that. Doing so may not be enough to ensure that the Annals, or the study of the history of computing as a separate discipline, will last another 20 years. Perhaps the digital computer will take its place alongside the four-cycle gasoline engine—or the washing machine. Or perhaps historians will abandon a focus on hardware (and associated programming languages and systems) and focus their scholarship on the culture of the World Wide Web and cyberspace. If computing becomes a thread in the fabric of daily life and therefore invisible, as current researchers at the MIT Media Lab claim will happen, it will not be easy to maintain a journal. Regardless of how computing evolves, much remains to explain how computing developed to its present stage, never mind to a hypothetical point of invisibility in the future. The question of whether the subject was worthy of study was answered affirmatively long ago. The question of whether it will remain so should also be answered in the affirmative, as long as the practitioners of its history provide a framework on which to tell this story.

References and notes

1. B. Galler, "A Career Interview with Bernie Galler," IEEE Annals of the History of Computing, vol. 23, no. 1, Jan.–Mar. 2001, p. 31.

2. P.E. Ceruzzi, Reckoners: The Prehistory of the Digital Computer, from Relays to the Stored Program Concept, 1935–1945, Greenwood Press, Westport, Conn., 1983.

3. B. Randell, The Origins of Digital Computers: Selected Papers, 2nd ed., Springer-Verlag, Berlin, 1975, p. 353.

4. J.A.N. Lee, ed., "Special Section: Analog Computers," IEEE Annals of the History of Computing, vol. 15, no. 2, Apr.–June 1993, pp. 8-52.

5. P.E. Ceruzzi, A History of Modern Computing, MIT Press, Cambridge, Mass., 1998, p. 2.

6. A. Newell, "Computer Science," Science, vol. 157, 22 Sept. 1967, pp. 1373-1374; also see H.A. Simon, The Sciences of the Artificial, MIT Press, Cambridge, Mass., 1969.

7. S. Amarel, "Computer Science," Encyclopedia of Computer Science, A. Ralston, ed., Van Nostrand, New York, 1976, p. 318.

8. G. Moore, "Progress in Digital Integrated Electronics," Technical Digest, IEEE Press, Piscataway, N.J., 1975, pp. 11-13.

9. D.J. De Solla Price, Science Since Babylon, 2nd ed., Yale University Press, New Haven, Conn., 1975, p. 169.

10. M. Kranzberg, Kranzberg Papers, 266, 301,1963; File: Proposal, MK to USAFI, Madison, Wis.;Nat’l Museum of American History Archives.

11. M. Kranzberg and C.W. Pursell Jr., eds., Technology in Western Civilization, vol. I, Oxford University Press, New York, 1967, pp. vi-vii.

12. M. Kranzberg and C.W. Pursell Jr., eds., Technology in Western Civilization, vol. II, Oxford University Press, New York, 1967, p. 3.

13. L. Press, "Before the Altair: The History of Personal Computing," Comm. ACM, vol. 36, no. 9, 1993, pp. 27-33.

14. T.P. Hughes, Rescuing Prometheus, Pantheon, New York, 1998; P. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America, MIT Press, Cambridge, Mass., 1996; K.C. Redmond and T.M. Smith, From Whirlwind to MITRE: The R&D Story of the SAGE Air Defense Computer, MIT Press, Cambridge, Mass., 2000.

15. S. Garfinkel, Architects of the Information Society: Thirty-Five Years of the Laboratory for Computer Science at MIT, MIT Press, Cambridge, Mass., 1999.

16. M.S. Mahoney, "The History of Computing in the History of Technology," Annals of the History of Computing, vol. 10, no. 2, 1988, pp. 113-125.

17. S. Garfinkel, Architects of the Information Society: Thirty-Five Years of the Laboratory for Computer Science at MIT.

18. D. Engelbart, personal communication, 1 May 1998.

Paul E. Ceruzzi is curator of aerospace electronics and computing at the Smithsonian Institution's National Air and Space Museum in Washington, D.C. He recently published A History of Modern Computing (MIT Press, Cambridge, Mass.). Currently, he is working on a research project to document the history of systems engineering firms located in the vicinity of Tysons Corner, Virginia.

Readers may contact Paul Ceruzzi at [email protected].
