Part 1
Retrospective: Hardware on a Breadboard

Copyright ©2000 by Paul Niquette, all rights reserved.

In recent years, I have thought back to 1953 -- and beyond, to the 1940s -- probing my memory for the earliest relevant recollections. First to mind come all those electronic projects I worked on as a kid, including crystal sets and code-practice oscillators to earn Boy Scout merit badges. Later, I spent hours poring over plans clipped out of Popular Science. One summer, I built a superheterodyne short-wave receiver, using war-surplus tubes and transformers. At night, when atmospheric conditions were just right, I was able to pick up the BBC and other faraway stations speaking strange languages.

My father was an electronic technician during World War II.  He watched my juvenile struggles. One problem I had early on was to bring solder and iron and wire together -- with only two hands. He taught me the art of sculpting pre-tinned leads from twisted strands dipped in greasy flux and gently touched to the iron tip, thence to be drawn from the acrid smoke all silvery and ready for conductive attachment. My dad left me alone, except to grin later, with earphones pressed to his ears, as I tapped out C-Q ("seek you") on my home-made telegraph key.

My mother donated a cracked breadboard to serve as a substrate for my screwed-down treasures. That's right, a literal "breadboard," which by coincidence, let's assume, continues into the present century as the universal term-of-art for prototypes.

The workbench in our garage held stacks of coffee cans and shoe-boxes filled with screws and nuts, banana jacks and clip-leads, wire and terminal strips -- a vast inventory of parts purchased with my lawn-mowing wages at the neighborhood hardware store. That's right, "hardware" store -- decades before Radio Shack and Federated.

Algebra was my favorite subject in high school. My instructor noticed I was turning in all the problems at the end of each chapter, not just the odd-numbered ones that he had assigned. Mr. Russell (not Bertrand, I am quite sure) lent me a book on symbolic logic, which contained a chapter on "Boolean Algebra."

There was something wonderfully definite about truth and falsity, about premises and conclusions. Soon the "syllogism" reached my eyes.

In a matter of days, I became fascinated by the "valid deductive argument" much like -- well, much like I became fascinated with girls later on. In 1949, I cleared my workbench and set about to breadboard my first invention, a "syllogism prover," using rotary switches and rejected relays from bombsights and autopilots brought home by my dad.

My inspiration was a paper by Claude E. Shannon published in the late 1930s on symbolic logic and summarized in a retrospective that appeared in Science Digest, to which I owned a subscription and read monthly from cover to cover. The first version of my syllogism prover operated on dry cells and had four main parts.

  1. An eight-position rotary switch was wired to a set of three relays. Each relay had multiple contacts capable of asserting and negating the three Boolean variables X, Y, and Z.
  2. A patch panel enabled the contacts of the relays to be wired in series and parallel to provide the respective Boolean operations of ANDing and ORing.
  3. Two relay solenoids were wired to the patch panel to receive logical signals. Their normally open contacts, representing the "premises" P1 and P2, were wired in series, producing a TRUE signal only when both premises were TRUE.
  4. A third relay solenoid was wired to the patch panel to receive logical signals. Its normally closed contacts, representing a FALSE "conclusion" C, were wired to a lightbulb labeled "Invalid."
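The relay logic above can be sketched in software. This is a minimal simulation, not a description of the original hardware: clicking the rotary switch through all eight positions corresponds to enumerating every truth assignment of X, Y, and Z, and the "Invalid" lamp lights whenever both premises are TRUE while the conclusion is FALSE. The function names and the sample syllogisms are illustrative.

```python
from itertools import product

def lamp_lights(p1, p2, c, assignment):
    """The 'Invalid' lamp lights when both premises hold but the conclusion fails."""
    return p1(*assignment) and p2(*assignment) and not c(*assignment)

def prove(p1, p2, c):
    """Step through all eight switch positions; the argument is valid
    if and only if the lamp never lights."""
    return not any(lamp_lights(p1, p2, c, a)
                   for a in product([False, True], repeat=3))

implies = lambda a, b: (not a) or b

# A classic valid syllogism: (X implies Y) and (Y implies Z) entail (X implies Z).
p1 = lambda x, y, z: implies(x, y)
p2 = lambda x, y, z: implies(y, z)
c  = lambda x, y, z: implies(x, z)
print(prove(p1, p2, c))   # True -- the lamp stays extinguished

# An invalid argument (affirming the consequent): (X implies Y) and Y do not entail X.
q1 = lambda x, y, z: implies(x, y)
q2 = lambda x, y, z: y
d  = lambda x, y, z: x
print(prove(q1, q2, d))   # False -- the lamp lights at some switch position
```

Series wiring of contacts plays the role of AND, just as item 2 describes; the normally closed contact on the conclusion relay supplies the NOT.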
It might be noted in passing that the syllogism prover was by definition an "analog" computer, inasmuch as individual parts of the assemblage corresponded to specific parts of the problem being solved (a different problem required a different configuration of the physical components -- the "hardware"). That's the proper definition of an analog computer -- never mind the fact that the variables and operations inside were limited to two digital values, TRUE and FALSE.  In today's usage, "analog" has become synonymous with "continuous," "digital" has come to mean "discrete," and, thanks to advances in digital technology including software, "analog computers" have all but disappeared from the face of the planet.

There were no science fairs at my school, which is just as well. The syllogism prover took hours to set up (to "program" in the parlance of computers). The contraption would then be seen operated by a bespectacled nerd clicking the rotary switch and gazing at a small lightbulb on the panel, which for a "valid deductive argument" was supposed to stay extinguished. A real yawner. Even my dad shrugged when he saw it. Soon the syllogism prover was cannibalized to build a tick-tack-toe machine, which I never completed.

Thus, as far back as the late forties, the term "hardware" came naturally to me as I pedaled my bicycle home from the neighborhood hardware store with a sack of components for my adolescent projects.  Moreover, in 1947, according to the Oxford English Dictionary, the word "hardware" was already being applied to computers as a synecdoche.

Today, we hear the expression "military hardware," which, for all I know, is gratuitous slang, an informal expression used to distinguish capital equipment from personnel and consumable supplies. Manufacturers might do the same with "factory hardware," but they don't. Neither do construction contractors ("excavation hardware") nor cosmetologists ("hairdressing hardware").

Many people harbor the suspicion that for computer systems, the word "hardware" arose as a retrospective designation -- nomenclature made necessary in the sixties after the word 'software' became current.  If so, then "hardware" would be characterized as a "retronym," much like the word "ordinary"-- a term which became necessary to distinguish the high-wheel bicycles of the 1880s from those new-fangled "safety" bicycles with their chains and sprockets that became popular in the 1890s.

The practice of retroactive naming occurs in other realms ("World War I" replaced "The Great War" while "World War II" was raging in Europe) but not all (there is no film entitled "Rocky I").

However, as a term-of-art in computers, the word "hardware" was not a retronym, as confirmed by citations in the OED. An adolescent subscriber to Science Digest might well have applied that terminology.

Now, I am not saying that the syllogism prover in 1949 sensitized me to the distinction between 'hardware' and 'software.' That came later -- four years later. Meanwhile, I had soldered up a permanently crafted set of resources, comprising battery, breadboard, and relays. Somewhat separately, the machine was equipped with a rat's nest of changeable logic in the form of jumpers on a plug-board (much like the earliest EAMs -- electrical accounting machines, come to think of it).

The mere fact that the logic could be changed did not seem to me in any sense "soft," even though it was far more convenient to go about foozling with jumpers and sockets than unsoldering and resoldering tinned wires on terminal strips.  No, I am confident in the recollection that the mental precursor for "softness" necessarily waited for personal experiences with another kind of changeability.

Four years later, I won a scholarship and entered UCLA's College of Engineering as a junior. In June of 1953, I landed a dream job on campus at the Institute for Transportation and Traffic Engineering (ITTE). That marked the beginning of two years filled with memorable experiences.

During the first week at ITTE, I met the SWAC.
