


Jef Raskin's writings:
"Holes in the Histories"


HOLES IN THE HISTORIES

In Slaughterhouse-Five, Vonnegut’s protagonist, Billy Pilgrim, is in a hospital ward adjacent to a writer discussing a book he’s working on about the bombing of Dresden in World War II. Pilgrim offers information, telling him that he was in Dresden at the time. The writer is uninterested in facts that might upset his planned book and suggests that Pilgrim write his own version. I have often been in Pilgrim’s place in discussions with authors writing on events in which I have played a role. I know that merely having been there, or even having been a principal, does not give one a privileged portal to the truth. Though, since Copernicus, the motions of our planet are no longer seen as geocentric, our individual worlds still revolve egocentrically around ourselves. Indeed, we cannot see events except from inside our own minds, though we can attempt to test our recollections against various lines of physical and testimonial evidence.

The popular media has a poor track record of presenting the recent history of technology, at least with regard to the commercial side of the story of how human-computer interfaces came to be the way they are. I wondered where the incorrect information had come from and why the authors didn’t pick up a phone and call the people involved—it’s not as if this is ancient history and all the principals and their relatives are long dead (though time is running short in this regard). Had the reporter’s quest for truth and the historian’s thirst for facts evaporated? Before looking at the reasons for the inaccuracies, I should perhaps first explain how I happen to be in a position to write somewhat authoritatively on this topic:

In the spring of 1979 I went to the Chairman of the Board of Directors of Apple, Mike Markkula, and proposed that Apple build a new kind of computer. It was to be inexpensive; have a small footprint; use a built-in, graphics-based screen; and—my most heretical point—it would be based on human factors considerations rather than on whatever was hottest in electronic technology at the moment. My name for this project was "Macintosh". (See also "The Genesis and History of the Macintosh Project".)

Having introduced the concept of human interface development as a discipline at Apple, and being one of the early observers of the work at Xerox’s famous Palo Alto Research Center (PARC), I have subsequently been astonished, amazed, disappointed, and at times upset by what I’ve read. Even the prestigious Harvard Business Review got the basic facts of the origin of the Macintosh interface nearly backwards. This is especially distressing since Harvard teaches business partly through case studies; when students are fed fictions as facts, it is not unreasonable to fear that their understanding will suffer accordingly. Occasionally, I have written a letter to correct one or another error that appeared in print. Sometimes these letters had the effect of influencing future articles, sometimes they disappeared without a trace, and once or twice they were loudly refuted by people who hadn’t been there and had no documentation behind their theories.

There have been many books on the history of Apple, some by or about its major players, and in 1994—the 10th anniversary of the commercial introduction of the Macintosh and the 15th anniversary of the project’s inception—a new rash (in three senses of the word: a plethora; hasty; a pox) of books and articles appeared. Where these works discussed events in which I was not a participant, I found them interesting and credible—until it occurred to me that if the sections where I knew what had happened were wrong (sometimes wildly so), there was no reason to expect the rest to be accurate. My own collection of contemporaneous drawings, memos, and letters often allows me to fix a date or assign credit accurately, but reporters and writers have not asked to search through this material—or, probably, most of that in the hands of others—for themselves. A number of times I have offered free access, but to no avail.

There are a number of reasons the historical accuracy has been so bad, and they range from the subtle to the banal. Some writers take a cavalier attitude toward history, while others indulge in a crass opportunism that explicitly eschews facts whenever they would take effort to check or would interfere with the attractiveness of the story line in terms of possible movie or TV rights.

SECONDARY SOURCES

Let’s start with an elementary technique of serious historians: using primary sources whenever possible. Looking at the references in the two most recent books, Levy’s Insanely Great and Stross’s Steve Jobs and the NeXT Big Thing, one observes that they are almost all secondary, taken from earlier books, magazine articles, or newspaper accounts. Rarely are original documents cited; in-depth interviews with participants are only a bit more common. Replication of errors made a decade ago covers the pages like an algal bloom. The more books of this sort that are published, the more "sources" one can find that agree on a "fact." Eventually the fabrication becomes indisputable on the basis that "everybody says so. Look, I have seven references on it." Uncritical use of secondary sources is a major problem. But searching through tens of thousands of pages of documents is hard, time-consuming work, and conducting repeated interviews to sort out inconsistencies is a bother. The overwhelming impression one gets is that work and bother are off-putting to today’s authors—and, as we shall soon see, some freely admit this.

OVERSIMPLIFICATION

Thomas Morton was writing of the history of science (in American Scientist, Vol. 82, p. 182), but his observation fits technology as well: "Historians often reason from the internal evidence...but [in science and technology] a parallelism between two accounts cannot reliably be used to infer that one influenced another (or even that they were influenced by a common source.)" It is easier to attribute every invention to one person or organization than to untangle the unwieldy web of the way things actually happen. If the same idea crops up in two places, it is easiest to assume that one must have taken it from the other. Combine this kind of simplification with an avoidance of primary sources and you can wander far from the truth. For example, Stross speaks of Xerox PARC thus: "...like Old Testament genealogy, every important development in personal computers traces back to this same single source." To be sure, PARC’s influence was broad, deep, and beneficial, but it was by no means the "single source" of "every important development." Stross’s blanket claim ignores the influence of Sutherland’s far earlier Sketchpad system; Engelbart’s prior conception of the mouse and windows; the fact that the all-important invention of the microprocessor itself did not take place at PARC; and the fact that the people who created the early personal computers (Apple I, SOL, Poly 88, Heath H8, IMSAI, Altair, PET, etc.) generally knew nothing of and took nothing from PARC. Many significant examples of influential software that did not derive from PARC’s work, such as the systems written by Bill Gates, Gary Kildall, and Steve Wozniak, also come to mind.

Strangely, by misattributing everything to PARC, the true contribution of PARC (insofar as we can evaluate it at such a small historical distance) is also diminished. A blanket "everything" often leaves the impression that what you see on the Mac and Windows is the sum of what PARC did. But the people at PARC have done much more than that, not only with regard to interfaces, but in many other independent and collateral areas of computer science, and they continue to do significant and pioneering work.

I can give an example from my own experience that combines a few of the sources of error discussed so far. In the late 1960’s I had come to realize the importance of what is now called WYSIWYG (What You See Is What You Get) displays. It would not do, for example, to have a limited set of fonts on a display and a different set of fonts on paper. So, at a time when hardware character generators were universal for computer displays (they could usually generate one ugly font, with underlining, brightness reversal, and blinking as the sole typographic options), I published a proposal arguing that computers would have to be built without them. A few years later, in the early 70’s, the researchers at Xerox PARC came to the same conclusion independently and started building computers embodying this idea. The workers at PARC also believed, as I did, that human usability was more important than the traditional concerns of computer science at the time: execution speed and the efficient use of memory. When I visited PARC shortly after it opened, I found, for the first time, a computer-oriented community that was sympathetic to my work. For their part, they found an outsider who did not have to be convinced that what they were doing was important or headed in the right direction. If Stross or Levy had gone back and read the works I had written before PARC was founded, or even interviewed the people I had known at PARC, they would have learned that many of the Mac’s key concepts had had an independent genesis.

SLOPPY SCHOLARSHIP

Another problem with books on the history of Silicon Valley is a dearth of simple fact-checking. Jeffrey Young’s book Steve Jobs, published in 1988, is one of a number that share the flaws of the books I’ve already mentioned, and it is especially weak on details. My copy’s margins are full of comments such as "No," "False," and "Not quite." I found myself inserting the names of the actual people involved in a number of places. Even easy-to-check details are flubbed: the go-go-dancer-and-poet-turned-computer-maven Bana Witt becomes "Bana Whitt" (she deserves a book of her own). Young makes the truly absurd claim that I "saw no need for graphics" in the Macintosh product, and so forth. Some books are better than others in this regard (the Time-Life series on personal computers is one of the better ones, and Owen Linzmayer’s The Mac Bathroom Reader is significantly better than the others), but it is clear that the editors, even at such established companies as Viking, Scott Foresman and Co., Harper & Row, and Basic Books, give little weight to accuracy of detail. John Sculley’s book Odyssey (written with John A. Byrne) says that I was a "programmer" at Apple; I held many positions at Apple, but programmer was never one of them. I assume that I haven’t been singled out for inaccurate treatment and that an equal percentage of errors applies to other people and events.

DELIBERATE MISREPRESENTATION

Another cause for inaccuracy is the deliberate misleading of reporters, coupled with some reporters’ tendency to believe an apparently sincere and/or famous source. Levy’s book gives prominent thanks to Apple’s PR department, which learned the history of the Mac from Steve Jobs, whose well-deserved sobriquet at Apple (and later at NeXT) was "reality distortion field." Many times I saw him baldly tell a lie to suppliers, reporters, employees, investors, and to me; Stross’s book provides many examples of this. When caught, Jobs’s tactic was to apologize profusely and appear contrite; then he’d do it again. His charm and apparent sincerity took in nearly everybody he dealt with, even after they’d been burnt a few times. To those who didn’t know him he seemed utterly credible. In his defense it should be pointed out that some reality distortion is necessary when you are pioneering: when I am conveying my vision of the future, I create a non-existent world in the minds of listeners and try to convince them that it is desirable and even inevitable. I’m pretty good at this, but Jobs is a master, unconstrained by "maybe" and "probably." His attractive creation myth—swallowed whole by susceptible reporters—wherein Apple’s computers were invented exclusively by college dropouts and intuitive engineers flying by the seats of their pants, became legend. To hear him tell it, the Macintosh had practically been born, homespun, in Abe Lincoln’s log cabin. That it had been spawned by an ex-professor and computer-center director with an advanced degree in computer science would have blown the myth away. A good story will often beat the dull facts into print.

For example, after Byte Magazine published the "official" version of the creation of the Mac as a cover story in 1984, two enterprising reporters (John Markoff and Ezra Shapiro), acting partly on my comments to them about that article, interviewed the actual crew that started the Mac. The follow-up article was buried toward the back of the magazine, under the weak title "Macintosh's Other Designers." It received, predictably, little attention.

THE HALO EFFECT

This effect causes every invention to be attributed to the leader, the most charismatic, or the currently most newsworthy member of a group. For example, before Steve Jobs’s fumbling at NeXT exposed his weaknesses, he was usually credited with having invented the Macintosh. As his star was declining and NeXT beat one strategic retreat after another, General Magic—cofounded by Bill Atkinson and Andy Hertzfeld, who had both worked on the first Mac—was announcing its first product amidst much hoopla. Thus I found, in the Dec. 27, 1993/Jan. 3, 1994 issue of InfoWorld, a story erroneously hailing Bill Atkinson and Andy Hertzfeld as the creators of the original Macintosh. As John Sculley (after leaving Apple) was ending his brief tenure as CEO of Spectrum under notorious circumstances, a National Public Radio report incorrectly described him—instead of Jobs and Woz (Steve Wozniak)—as the founder of Apple.

The halo effect also assigns superhuman abilities to the famous, often overriding a reporter’s skepticism. Jeffrey Young writes of the first time that Steve Jobs (along with Atkinson and others) saw the work done at PARC: "Atkinson and the others were asking Tesler questions, one after the other. Tesler was quoted as saying, ‘What impressed me was that their questions were better than any I had heard in the seven years I had been at Xerox... Their questions showed that they understood the implications and the subtleties...’ " But Young did not ask how they came by a depth and speed of understanding that no other mortals could achieve; the halo effect had blinded him. The real reason for their near-instantaneous grasp is that they had been carefully prepared for the visit. I had repeatedly explained the details and rationale of the work at PARC to Atkinson, Jobs, and others. PARC’s philosophy was therefore well known at Apple. Tesler didn’t know about this background, wasn’t told, and so was bowled over.

GOING BY APPEARANCES

Prior to the coming of the microprocessor, the computer industry (exemplified by IBM) was a bastion of corporate formality. When I was invited in the 1960’s to give a talk to IBM executives about new directions in computer applications, I chose to go tieless in blue jeans and a flannel shirt, since I thought this would lend some shock value to my presentation. The talk went well, but when I was invited to join my host for lunch, I was stopped at the door to the cafeteria by a uniformed IBM employee. He said, "You can’t come in, sir, without jacket and tie."

My hosts had long since forgotten the rule; nobody even thought of working at or visiting IBM in attire such as mine. We had no extra tie or jacket, and were at an impasse until someone went ahead, took off his jacket and tie, and tossed them back to me. Apparently, the rule was that you could not enter without a tie, but there was no rule about taking it off once inside. Dress codes were then typical of the computer establishment, so when some of the microcomputer companies started up, they not only abandoned the technical methods of the big computer companies but made a point of throwing out the trappings as well. This was especially true at Apple. Properly dressed reporters who visited in the early days, accustomed as they were to traditional computer companies, found the un-computer-company style at least as remarkable as the products. Our penchant for odd dress and irreverent play (frisbees in the hallways and the like) conveyed the spirit of the products and obscured the serious work going on in the cubicles. Our then-unusual lifestyle made good PR that could reach audiences otherwise uninterested in computers, and gave the products an aura of fun and novelty rather than work and stodginess. This was great marketing, but it was also a smoke screen, one that has continued to befuddle reporters to this day. Many continue to take a penchant for play, eccentric mannerisms, and eclectic dress as a disinclination to do hard and serious work.

THE IRRELEVANCE OF TRUTH

The last cause for inaccuracy that I will take up is an overly casual attitude, and a kind of arrogance, on the part of some writers. It is rare to get an explicit admission of this, but I must tip my hat to Robert Cringely, who writes a delightful weekly column in InfoWorld, a computer trade journal. In his book on Silicon Valley events, Accidental Empires, he has the Mac and Lisa (an Apple computer that didn’t make it commercially) projects being created by Steve Jobs after Jobs made the visit to PARC "in 1980" and came back all aglow with inspiration.

I emailed Cringely to point out that his book—like those of a number of other authors—was wrong; Jobs had indeed made a visit in December 1979, but the Mac project was proposed in the spring and officially started in September 1979. In other words, the project was well under way before the event that was supposed to have inspired it took place. Cringely was unabashed. He emailed back: "As for all the business of what project started when, whether Lisa started before or after Steve visited PARC, whether the Mac had already begun or not, well I don’t think that it really matters very much. My attempt was to EXPLAIN (I say that at the front of the book), not to be a historian."

How an author can hope to explain what happened if he doesn’t even know what happened eludes me.
Later I discovered that the people he interviewed were mostly Apple’s PARC expatriates, whose association with Apple began after the Mac was well under way. Thus they could only tell him about the development of the ideas at PARC and about the work on Lisa (they were not then associated with the Macintosh project) after some time in 1980—that is, after Apple was committed to the basic direction the Mac group had already established. Not terribly aware of that work, they related what they saw only to what they knew from PARC.

It’s not only books, of course, but other mass media that have presented a confused view. The PBS special on the history of computers made the same mistake of attributing the genesis of the Mac to Jobs’s visit to PARC. When I sent the correct information to Jon Palfreman, its producer at WGBH, he replied: "The part of the program you are referring to comes at the end of a lengthy segment about the highly innovative work done at Xerox PARC. This section was based on extensive interviews with Alan Kay, Bob Taylor and Larry Tesler. The purpose was to show that the key concepts of interface design which today are a feature of most PCs (if you count Windows) were first discussed at Xerox PARC. When those ideas were embodied in a relatively affordable machine—the Macintosh—they began to change the world of personal computing. I was aware of your key role in the Macintosh project, and indeed of the contribution of people who developed Lisa. My aim in this particular program wasn't to detail the history of Apple but to show how the key interface ideas found their way into consumer PCs."

His excuse sounds much like Cringely’s, for he cannot "show how the key interface ideas found their way into consumer PCs" without detailing the history of Apple, which is where it happened. And, of course, some of the key concepts had already been discussed prior to the founding of PARC. Errors of this sort force us to wonder about the accuracy of the rest of the series.

WHAT’S MISSING

The years of study, thinking, and experimentation by many talented people on the Macintosh project—and elsewhere—have gone largely unreported, though they led to the breakthroughs that made the Macintosh, and the systems built since its introduction, so much of an improvement over what went before. Against this complex reality we have the powerful mythological image of Jobs drinking from a Well Of All Knowledge, having an "aha!" experience, and coming back at full cry to Apple to create a fantastic project. This scenario is familiar—it parallels that of Archimedes jumping naked out of his bath crying "Eureka!" and a dozen other such stories. It inverts Edison’s observation that "Genius is one percent inspiration and ninety-nine percent perspiration." When Cringely reported in his InfoWorld column for 4 April 1994 that his book was being made into a TV miniseries, he crowed that it represented "the ultimate triumph of style over substance." One can admire his candor while deploring his scholarship and envying his earnings. Some 2,400 years ago the historian Thucydides had a higher calling: "My history has been composed to be an everlasting possession, not the showpiece of an hour." Today we get shows that air for an hour.

Along with oversimplification, reliance on secondary sources, weakness on background, inattention to detail, susceptibility to the halo effect, and a general attitude problem among some of the people who have reported on the history of technology, there has been a belief in things happening by magic. Intense intellectual effort and in-depth technical expertise vanish, to be replaced by tales of inspiration and guesswork. The legend tells us that scholarship and hard work are not necessary in order to usher in a new age. Yet the same legends speak with awe of the 80+ hour-per-week grind of the faithful, driven employees. What were they doing all those hours? Drop out, turn on, assume the lotus position, eat jelly beans, have pizza-and-beer parties, and fortune will surely follow, sing the storytellers. The truth lies elsewhere.

APPENDIX

One of the most reliable sources of information on who did what and when is the "Book of Macintosh," a collection of documents written by the members of the Mac team during the first few years of the project. Here is the beginning of one that I ran across only recently (December 1994), when a researcher sent a copy to me. The date alone suffices to settle the question of whether the Mac project was started after Mr. Jobs’s later trip to Xerox PARC. This particular document is noteworthy in that it shows that Apple was still debating internally whether personal computers would be useful in the home. Also, it was not until a decade and a half later that Apple finally decided to create its own online service, an effort that I see as being rather late on the scene.

And I will admit to some pride in having foreseen, with reasonable accuracy, the applications of such a service. The following excerpt is unedited; even the embarrassing spelling errors are left unfixed. For this excerpt see "The Macintosh Project - Apple Computer Network" by Jef Raskin (Fall 1979).

For reference also see "The Genesis and History of the Macintosh Project" by Jef Raskin (1981).


