Part 7
Question: Fair is Fair

Copyright ©2001 by Paul Niquette, all rights reserved.

Fair is fair.  The question posed regarding my challenger...

     Why did John Wilder Tukey go to his grave without actually claiming the invention of
     the word 'software' 42 years earlier?

...with some modification, applies to me.  Considering what is known today -- indeed, given what the world has known since the sixties about software, about its manifold impacts on every technology, about its power to shape modern life...

Why would the person who coined the word 'software'
wait a half century before claiming it?

The previous sentence deserves an exclamation point more than a question mark. I shall refer to it from this point onward as the why question.  Anyone called upon to judge the assertion I am making here is entitled to the best retrospective answer to the why question I can give. Answers plural. Permit me to frame them in decades.

1950s

The most critical interval for documenting my claim, of course, will necessarily be the 1950s. Alas, those youthful years blasted by in a blur of activities -- years filled with distractions and impediments both personal and professional, affording me abundantly plausible answers to the why question. Throughout that decade, I simply had no thought of 'software' as being anything to claim. Simple as that.

As described elsewhere, the hardware used in my high school projects with symbolic logic was limited to relay technology and programmed by patch panel. It was a discrete-value analog computer.  Until 1953, like most people at that time, I had never seen a stored-program digital computer -- a von Neumann Machine. Then, while in my junior year at UCLA, I was actually paid to program the SWAC.

It was in October of that year that I coined the word 'software' more or less as a prank. I was 19 years old.  I never expected the term to be taken seriously. For the next two years, I used 'software' in dozens of speeches and lectures, accompanied necessarily by its definition.  Shrugs and smirks hardly provided an incentive to accept -- let alone seek -- any kind of credit for the word 'software'. There, for the record, is my earliest answer to the why question -- my own dismissive state of mind, not to mention frequently frivolous post-adolescent behavior (see sniglets).  Which is not to say that I did not perceive -- even at those earliest moments -- the wondrous, world-changing power of software.  It's just that I thought the word 'program' was serviceable enough for serious speaking and writing.

The main body of my work at UCLA was in automobile crash injury research and radar speed meters, both a far cry from anything relevant to the why question. All publications that referred to the computer programs I wrote, of course, received attribution to the principal researchers (Dan Gerlough, Slade Hulbert, Harry Matthewson, Bob Brenner), who were warily disinclined to incorporate 'software' in any erudite writings that had been drafted by a mere undergraduate. Then too, from the very beginning, I regarded the word 'software' as a misnomer.  Even at the age of 19, I sensed the unique hardness of software (see Software Does Not Fail).  There is nothing created by mankind that is more enduring!

For the first two years after graduating from UCLA, I worked at Hughes Aircraft on secret projects, including radar sets, airborne fire control systems, and guided missiles. All applied continuous-value analog computers. Still in my early twenties, I was not allowed to produce technical papers on computer hardware.  Few people indeed were then writing about software, which would have been called programming anyway.  That answers the why question for those years in terms of contemporaneous publications. I kept up my private studies on "finite automata" and their applications, though, and accumulated something of a local reputation. In 1956, I was contacted by the State of California and received a contract to write the syllabus for the very first junior college course on computers, both analog and digital. The Department of Education awarded me an expedited credential, and I began teaching the course to SRO classrooms at LA Tech. Naturally, I spoke the word 'software' in my lectures but only informally, always as a term of distinction from 'hardware', giving the word 'program' the merits of invincibility.

My luncheon talks to Kiwanis or Lions about "giant brains" were quoted occasionally in the local press, but I was careful to credit others with the wondrous inventions themselves and would never have sought recognition for the handful of words being appropriated to describe them ("you heard it here first, folks"), which is the most rational answer I can give to the why question for those occasions.

In 1957, Ray Jacobsen, then a division vice president at Thompson-Ramo-Wooldridge Products (TRWP, pronounced "twerp" -- happily renamed TRW Computers) was a student in my computer class, along with his marketing team comprising Lou Perillo, Ray Stanish, and Rigdon Currie.  After a dozen hours of observing my antics, Jacobsen hired me away from Hughes to teach programming and maintenance courses on the RW-300, a digital computer designed by space scientists for application in the earliest real-time process control systems.  These were the most serious of industrial environments, by the way. At TRWP, we were confidently programming primitive computers to run chemical factories, oil refineries, cement plants, and -- shudder -- nuclear reactors. Still in my mid-twenties, however, I continued to build a singularly unserious reputation. (see Debugging and Degaussing).

Eventually bored with the silly word 'software,' I used it sparingly myself, treating it more or less as slang, but then in the late fifties I began hearing it occasionally -- at first only from my students, later from customers and suppliers. I remember well that I found it amusing if not ironic. The word 'programming' continued in use, enjoying whatever respect the profession could garner for itself, while the word 'software' was often pejorated: occasionally 'software' was even treated as a verb ("we'll just have to software around that problem").

At the Eastern Joint Computer Conference in New York in 1958, I was alone in the TRW booth when a television news crew showed up.  I was standing alongside the company exhibit, the only one with a fully functioning computer system on display.  I gave an interview about the RW-300 and its innovative applications.  By coincidence, my father, whom I had not seen in a half-dozen years, stood grinning in the crowd, face iridescent.  After pointing out features of the hardware, I unabashedly described the company's "real-time programs as what we call 'software'."  Nearby shaking his head and wincing stood Lew Ward, head of public relations at TRW. I felt the smile disappear from my face. Thus, the answer to the why question, both then and later, included an element of jocularity on my part.  One interview I remember most vividly was for "Voice of America." Afterwards, as an act of bravado, I tormented the public relations staff by reporting that I had made the remark, "Only in the most backward countries are they replacing computers with people." I then hastened to reassure my wide-eyed colleagues that the on-air translation of my words into Serbo-Croatian would not have included the word 'software'. Whew.

In 1958, while still at TRW, I became a consultant to the newly founded Federal Aviation Agency (FAA) at the National Aviation Facility Experimental Center (NAFEC) in Atlantic City, New Jersey, doing fundamental research in air traffic control systems. There I wrote programs for the RW-300 and IBM 704. I gave four software-intensive papers on man-machine simulators, track-while-scan radars, and computer-driven displays. My work was published in the proceedings of the AIEE. The NAFEC research is covered in a separate memoir (see Squawk 1200).

With respect to those solemn writings, of course, the answer to the why question is retrospectively obvious to any fair-minded reader: claiming coinage for the word 'software' was simply not appropriate to the subject matter and would have achieved nothing beneficial for my professional aspirations. Nevertheless, I vividly recall at least one formal occasion during which I spoke the word 'software' in a conference at Pennsylvania State University.  Even at the podium, I did not think to attribute the word to myself (see Part 3).
 
 

1960s

For the early sixties, there is this one answer to the why question I might really like to offer -- that it would have been extremely immodest for me to claim the invention of a word with such rapidly widening currency. In all candor, however, my self-restraint came from an expectation that 'software' would soon fade from the lexicon of the industry, replaced, most likely, by "computer science."

My engineering work at that time comprised both hardware and software, including custom applications of the CDC-160A manufactured by Control Data and on-line stock-price delivery systems, trade name "Quotron," for the financial industry produced by Scantlin Electronics. I would hear the word 'software' occasionally, but it was not yet a fully accepted word in the computer field. Surely that is enough of an answer to the why question for this interval. But there is more, much more.

Real engineers designed and built hardware. I firmly recall that 'software' was regarded by most people as a sissy word, and managers treated programmers as second-class citizens.  They skulked about with their coding sheets and punched cards, each decidedly loath to characterize his or her own endeavors with such a marginal term as -- ugh! -- 'software'.

In 1963, I joined Scientific Data Systems (SDS), one of the "Seven Dwarfs" -- manufacturers of computers, all scrambling for a share of a market 80% owned by "Big Blue" (IBM). At SDS, I got undeservedly promoted and took up the struggle against the stereotype of a good engineer becoming a bad manager.

Plunged into high-intensity, administrative assignments, I formed and managed several hardware engineering departments (peripheral controllers, manufacturing testers, memory systems, design automation); I was responsible for the development of computer-based industrial control products, systems for the man-in-space program, and custom equipment for computers. With the appearance of "monitors" and later "operating systems," software gradually emerged from its status as a necessary evil and became a sub-industry, with growing economic importance. As for the why question, during the early sixties, practitioners of software were widely scorned for their lack of discipline. After all, they persisted in scuffing about in sandals and unkempt long hair. Enough said.

IBM in 1964 introduced the immensely successful System/360 and in 1968 made the fateful decision to "unbundle" their support services -- read 'software' -- from the hardware pricing itself. Given IBM's dominance of the computer industry, these two events probably did more than anything else to establish the word 'software' in popular vocabularies.

The reader is here invited to observe that my predicament with respect to the why question and everything else in my life changed abruptly in 1969.  For a record-setting price of a billion bucks, SDS was acquired by the Xerox Corporation, hardly a paragon of software technology. With aspirations to attack IBM head-on, Xerox gave SDS the name Xerox Data Systems (XDS), which then became a forlorn passenger in steerage aboard the Great Ship Duplicator.
 
 

1970s

Almost immediately after the acquisition, the giant hand of Xerox reached down and plucked me out of XDS in California and plugged me into a new career at corporate headquarters in Stamford, Connecticut.

Picture in your mind a nasal-retentive, unsophisticated engineer replete with pocket-protector and horn-rim glasses, recently sobered by horrendous adaptations to line management responsibilities, now suddenly transported into the politically nuanced world of corporate staff work and put in charge of a think tank, surrounded by a council of Ivy League authorities -- a priestly class of business futurists in three-piece suits and wingtips, who spoke in hushed tones and bowed to each other. I was not qualified to tie their shoes, but -- hey, I was their leader.

Concealing bewilderment in brashness, revealing impatience with unsubtlety, I quickly established my insufferable reputation as the "Coyote from California." Eight years later I would be given the "lateral arabesque" then fired -- but I'm getting ahead of my story.

Known throughout the company, without affection, as "The Kitchen Cabinet," the seven of us commandeered the corporate jet fleet and conducted business reviews all over the world. Our meetings applied "The Rational Process," and we gave authoritative speeches and lectures in every university and business school you can name on "Financial Planning and Control," "The Architecture of Information," and "The Office of the Future."  My specialty was "The Post-Petroleum Age" and later "The Software Age."

In addition to our collective oversights (I preferred the expression "overview responsibilities") on behalf of the corporate management, The Kitchen Cabinet was held accountable for forecasting the business implications of emerging technologies. There were many of those in the siliconizing seventies -- plus software, of course, unquestionably the least understood topic at the time in the deep-pile, hallowed hallways of Xerox and, along with microprocessor-based hardware, terribly unsettling to the status quo.  Xerography had long been dominated by an establishment steeped in photo-receptors and doctor-blades, stepping switches and relay logic, where "high-tech" meant a photo-diode instead of a mechanical lever.

Oh right, and it was a high-stakes game. The enterprise had aspirations to grow new revenues at the staggering rate of one billion dollars per year per year! The repetition at the end of the previous sentence is intended and so is the exclamation point.

In retrospect, it is fair to say that The Kitchen Cabinet fulfilled its mandate.  We delivered closely reasoned insights about the future and sound guidance for scaling mountains of emerging business opportunities, which were, alas, nervously made into molehills by near-term financial mentalities.  Don't get me started.

Too late. Most relevant to the present subject were the visionary concepts my colleagues and I championed at the Palo Alto Research Center (known worldwide even today as PARC), which included several transcendent innovations: the mouse and the graphical user interface (GUI), Ethernet and peer-to-peer communication protocols. These Xerox inventions and others were leaked -- indeed given away, as the record shows.

A parade of entrepreneurs visited PARC through the '70s.  Among them were the two Steves -- Jobs and Wozniak -- who founded a computer hardware company incongruously named after a fruit, which went on to exploit these technologies in commercial products.  Oh right, and another visitor came by for a tour of PARC, a soft-spoken, bespectacled chap named Gates, who went on to found a company that actually tried to sell software products separately from hardware.

No matter.  PARC's magnificent developments were dismissed repeatedly by Xerox financial managers for not meeting bottom-line business criteria relevant to reprographics (see Thicket in the Bilge). Besides, there was all that talk about -- well, software.  For a landmark review of these events see Fumbling the Future: How Xerox invented then ignored the first personal computer by Douglas K. Smith and Robert C. Alexander, iUniverse.com, Inc. 1999.

To the chagrin of both IBM and Xerox, the personal computer took center stage.  Moreover, for Xerox, networking PCs together was only going to impact the market for reprographics, which was exploiting the worldwide paper deluge.   With a corporate sigh, one might imagine, IBM appropriated Charlie Chaplin and in brilliant campaigns preserved their primacy in computers, this despite the impudent intrusions of Apple.  Meanwhile, Xerox indulged in denial, a perilous policy in technology.  My heavily colloquialized business advice was "If you ain't got it, you gotta get it, or you gonna get got by it!"  Corporate budgeteers with xerographic toner under their fingernails rejoindered with, "Apart from smoke signals, no form of communications has ever been displaced by a successor."

If you can envision the likely reception at Xerox for any publicized claim by me for inventing what had become a politically incorrect word, then you have your answer to the why question for most of that decade.

1980s

Dropped out of the Xerox stratosphere, I parachuted into a raging battlefield of high-risk entrepreneurial adventures, including a disk-file manufacturing start-up (NIS), a diversified patent licensing service (APT), a gaming machine development company (G/N), and a video production house specializing in animation and special effects (VCS).   Don't bother to look them up. Suffice it to say, I paid my tuition for on-the-job training in capital formation.

With my personal resources depleted, I eagerly accepted a position as chief engineer at Computer Automation, where I took on the responsibility for leading 130 professionals in the "Naked Mini" division. Glad to have a salary, I was also especially pleased to be returning to my roots -- computer hardware. But everything had changed. Where were the "main-frames"? The machines I had known at XDS were now called by the retronym "maxi-computers," inasmuch as their successors were nicknamed "mini-computers." By the early 1980s, they were already being overtaken, especially in the control market, first by "single-board computers," then by "micro-processors," and finally by "microcomputers." Imagine that, a whole computer on a chip.

My short absence from fast-paced developments in computers put me in mind of a Rip Van Winkle, rubbing his eyes and shaking his head, bedazzled by "buttons and bottles." Hardware, though, was not my most difficult challenge. Before I had found the lavatory at Computer Automation, I was confronted by a decidedly transformed development world now dominated by -- ta-dah! -- 'software engineering'. That's right, I was suddenly surrounded by persons solemnly referring to themselves as 'software engineers'.

The origin of the word 'software' was far from my mind by then. Besides, what was I going to do? -- ask one of my software engineering managers, "Ever wonder where the word 'software' came from?"  Doing so would merely subject me to the why question and the disclosure that, despite software's momentous societal impacts, I had regarded the word itself as, in successive stages, unserious, unenduring, effete, and -- bite your tongue -- a misnomer.  "Excuse me?"

Later I joined American Automation as VP of Marketing, where I labored for seven years in the rapidly advancing, software-rich world of "development systems" for embedded microprocessors. Throughout the 1980s, then, I was thoroughly habituated to neglecting the subject of this story, thus adding another decade's answer to the why question.

1990s

We come now to the last decade of the century -- the millennium, in fact, remember? Gradually there arose all those mystical decimalizings that infected people with collective expectations.  Something more than the millennium was thought by some to be coming to an end on New Year's Eve, 1999.  In my case, however, there would be no apocalyptic nonsense looming, for I had begun a whole new career in 1991. In railroading.

Trains, people!  I found a new technological religion and became baptized. For me it would mean nothing less than total immersion in a sacred river of safety-critical engineering principles -- discovering, learning, and then influencing automatic train control systems.

The railroading industry was overdue for major innovations, and like every other realm of life in the 1990s, the trains of the world would experience radical changes and immeasurable benefits attributable to silicon and software. Thus, during what must be called my senior years, I have been deeply grateful for the opportunity to catch yet another technological wave. There would be, as the saying goes, new stories to tell my grandchildren.

The late Tom Sullivan ranked among the world renowned authorities in railroad signaling and control. I first met him at his office in New York City Transit in 1992. He became my mentor. Later he moved to California and became my best friend.

At an Equinox party in his hillside home, Tom Sullivan reminded me of something I had not thought about for a long time when he casually introduced me to his guests.

"Paul here is the guy who invented the word 'software'."

Excuse me.  I think I have told all this before.
At that moment and for the first time, I became aware of a retrospective question...

Why would the person who coined the word 'software'
wait a half century before claiming it?

Sitting here fully alive in The Software Age, all my answers seem lame.

Nevertheless, be advised, I did not go to my grave without claiming the invention of the word 'software'.  So there!