Ironic, isn't it? Everybody has a memory, but nobody can explain how it works.
Memories in brains differ from computer memories. Most
people finally understand
that. Brains learn, for one thing. You don't load them,
they learn. They
forget, too, even when the power stays on. Information
inside a human memory
fades away over time. Not all information,
though. Everybody
has -- well, memories. Fond or otherwise.
Computers find information inside their
memories mostly
by addressing, sometimes by scanning, never by
thinking. Brains work by
association, we think. Units of information are
coupled together by invisible,
elastic bonds. There are dynamic processes at work,
too. Circulating within
a baffling neuronic structure are electrochemical
impulses linking one
idea to another -- by relevance or by common attribute
or by whatever it
is that nourishes familiarity. Brains recall a memory
by consciously stimulating
the content of the memory itself, or part of it.
Stirring things up in
there somehow. The recalling process reinforces the
content, selectively
preserving the memory.
- Hence commercials and the marketing concept 'share-of-mind.'
Most memories are formulated from sensory
information. Recollections,
too, are often stimulated from outside. Three sense
organs (eye, ear, and
nose) are especially influential in accumulating
memories and recalling
them.
- A face, a song, a whiff can set off an avalanche of associations.
Thus, the invisible, elastic bonds transcend the boundaries of the brain itself, forming 'extrasomatic' (outside the body) links to
other brains. Individual thinking has little
sovereignty. People depend
on Socialized Memory.
Brains
work fast. We can recognize a face quicker than any
computer. That will
probably be true until well into the 21st century.
With all their 'megaflops,'
computers cannot process visual information fast
enough to catch a fly
ball -- cannot even tell a left hand from a right hand
at a glance. Computers
have an awful time recognizing individual words
embedded in continuous speech
-- especially accented speech, heard over a noisy
telephone line -- in
real time.
Our central nervous systems operate at
a speed sufficient
to keep us out of most kinds of trouble. But, when it
comes to capacity,
human memories, like everything else, are limited.
Along comes writing,
which makes for extrasomatic storage of information.
Reading puts it back
into a brain for processing. That takes time, though.
Sometimes too much.
- You can't expect your audience to be looking things up during your comedy routine (see Cultured Laughter).
- You can't be reading a flight manual while landing a plane.
- Nor a rulebook while umpiring.
As a practical matter, some items of
information have to
be kept handy within our brains. Others can be left
outside in books. Or
in computers, come to think of it -- and now at far-away
websites. Computers,
not surprisingly, have the same problem.
Only persons who have been living in a
cave since the
middle of the Twentieth Century would not know that
computers have various
kinds of memories. To describe three: First there is
ROM, for read-only
memory. ROM contains information that was built into the machine at the factory. ROM provides the computer with primitive capabilities -- unchanging
procedures for interpreting the keyboard, for shaping
the characters on
the screen, things like that. Human ROM, one might
say, gave us two things...
- the sucking instinct and
- the fear of loud noises.
The
computer has a second kind of memory, called RAM, for
random-access memory.
This is most commonly the working memory into which the
machine can electronically
write variable data. Fast and transient, RAM is most
like our own memory,
although, as has already been said, not much.
By the way, 'random' does not really
mean 'without order'
here. It means that the access time for any item in
memory is the same
as that for any other item, regardless of the order of
accessing. A distinction
worth noting, as we shall see.
Finally, there are the disks: 'floppies' (an embarrassing term first used to distinguish their flexible plastic media from the earlier rigid devices) and 'hard' disks (an equally embarrassing term that would be unnecessary had not someone coined 'floppy'). The typical floppy holds a megabyte or two of information; half a dozen of them would hold the King James Version of the Bible; it takes hundreds to fill a typical hard disk. Enter the CD-ROM, with the ability to replace stacks of floppies.
Capacious indeed are disks -- but slow.
Not as slow as
looking things up in a book, but hundreds of times
slower than RAM or ROM.
When the word 'cache' was first applied to memory, it needed to be pronounced in seven syllables: "cash, spelled c-a-c-h-e." The word was borrowed from the French cacher, to hide, and means 'hiding place,' a term familiar to explorers and pioneers, who used it to describe the storage of provisions in the wilderness. {SideBar}
Cache memory is a special form of
high-speed memory. The
first cache memories appeared in the early sixties.
That was back when
computers filled a room and were still called "giant
brains." RAMs were
slow and expensive in those days. A RAM in the sixties
comprised gazillions of tiny Cheerio-shaped iron cores, strung together with hair-thin
copper wires, forming blocks the size of bricks. Core
memories were slow,
and the bigger you made them the slower they operated.
The cache memory
-- also called a 'cache buffer' -- made 'large core
storage' operate faster.
Later, cache technology was used to buffer disks,
increasing their effective
speed.
The cache idea is based on an attribute
of information
itself, whether stored in chips and floppies or in
brains and books. At
any given instant, an item of information possesses a
kind of 'vitality,'
which is based on the likelihood that it will soon be
needed either for
computing or for thinking. Obviously, if an item in
memory is about to
be used, you want it to be available fast. Its access
time is more significant
than that of other items, which may not be needed for
a long time.
-
A public figure reads his or her
briefing documents just
before a press conference.
-
A pilot reviews the chart for an
airport just before initiating
an approach for landing.
So too, in a computer, you would like to
have the machine
automatically upgrade information into higher speed
memory just before
it is needed. But how does a mindless machine figure out
ahead of time
which items should be thus 'promoted'? In other words,
what does a cache
do?
Computers, as mentioned earlier, don't
make references
to memory in a 'random' order, despite the R in RAM.
If they did, cache
buffering would not work. Instead, computers tend to
access information
in blocks (items in a record) and repeatedly (loops).
The
cache exploits both of these tendencies.
For example, whenever the machine shows
an interest in
a given item by deliberately addressing it, the cache
automatically promotes
that and adjacent items -- often associated
information -- into a high-speed
form of memory.
A subsequent access
is first directed
to the cache, where, more often than not, the needed
item will be found
quickly. Only when the requisite information is not found in the cache does the access get redirected to the slower form of memory.
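To make the mechanism concrete, here is a minimal sketch in Python, with all names invented for the illustration: a plain list stands in for the slower memory, and a whole block of adjacent items is promoted on each miss.

    BLOCK_SIZE = 8   # number of adjacent items promoted together

    class BlockCache:
        def __init__(self, slow_memory):
            self.slow = slow_memory   # a plain list stands in for the slow device
            self.lines = {}           # block number -> list of promoted items

        def read(self, address):
            block = address // BLOCK_SIZE
            if block not in self.lines:
                # Miss: redirect to the slower memory; promote the whole block,
                # carrying adjacent items along with the one requested.
                start = block * BLOCK_SIZE
                self.lines[block] = self.slow[start:start + BLOCK_SIZE]
            # Hit (or freshly promoted): served from the fast copy.
            return self.lines[block][address % BLOCK_SIZE]

Reading item 10 promotes items 8 through 15; a subsequent read of item 11 never touches the slow list at all.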
Ah, you exclaim, but the cache will fill
up!
True enough. As a practical matter, to be
fast and cheap,
cache buffers cannot be large. With the cache full, the
whole machine decelerates.
The pace is then set by the slowest memory devices, as
accesses become
increasingly redirected to them.
Something has to give. Information must
be thrown out
of the cache ('demoted') to make room for newly
accessed strings of items.
The mindless machine must be endowed with the ability
to decide which items
to demote. What is the most 'reasonable' (to use a
brain-like term) basis
for that decision?
Your intuition is probably telling you the answer: Throw out the information that was accessed longest ago.
That's roughly what your own brain does.
Thus do repeatedly accessed memories -- recollections -- stay fresh. Information not recalled often enough will gradually fade into oblivion.
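In code, the demotion rule fits in a few lines. A sketch under the same assumptions as before (the capacity and the loader function are inventions for the example), using Python's OrderedDict to keep items in order of recency:

    from collections import OrderedDict

    class LRUCache:
        def __init__(self, capacity):
            self.capacity = capacity
            self.items = OrderedDict()   # stale end first, fresh end last

        def access(self, key, load_from_slow_memory):
            if key in self.items:
                self.items.move_to_end(key)          # recall reinforces the memory
            else:
                if len(self.items) >= self.capacity:
                    self.items.popitem(last=False)   # demote the stalest item
                self.items[key] = load_from_slow_memory(key)
            return self.items[key]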
As
applied in web browsers, the 'cache' improves the
apparent performance
of the Internet -- which is essentially a gigantic
disk memory distributed
all over the world and accessed by a communications
protocol.
The cache stores on your own hard drive a copy of each
page and image as
you acquire them from various websites. If you
click "Back", you
will get that same information not from the web but
from your own memory
-- a whole lot faster. The cache software is smart
enough to recognize
that what you are accessing is available
locally. If you come back
to cached pages or images before they get shoved out
of your computer's
memory, then you will enjoy the high performance that
caches were invented
to provide.
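A toy model, with function names made up for the occasion (no real browser exposes its cache this way): pages are kept locally, keyed by URL, so only the first visit pays the trip across the network.

    page_cache = {}   # URL -> locally stored copy

    def fetch(url, download):
        # 'download' stands in for the slow trip across the network.
        if url not in page_cache:
            page_cache[url] = download(url)   # first visit: network viscosity
        return page_cache[url]                # clicking "Back": a whole lot faster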
Nota bene: Websites
have visitors, not residents.
New visitors to a given site will suffer the
download delay that I like
to call 'network viscosity'. A short
time later, when they come back,
visitors will reap the benefits of the
cache. Now, the people who design
or manage each website, naturally, come back
again and again, each time
enjoying cache-supported performance. This may
be an explanation for some
of the more viscous sites that sprinkle
graphics and Java-jive all over
your screen -- the worst being pages
bombarding you with 'banner ads',
which peremptorily fill up caches. Heck, the
owners of those sites ('webmasters',
they call themselves) don't ever see just how
stultifying those delays
can be to the first-time or infrequent
visitor. Browsers provide commands
for emptying the cache, and I think every
website manager needs to give
his or her own cache a good flushing every once in a while.
Lack of creativity in computer language
bothers me a whole
lot, hence my habitual coining of neologisms, which
seldom do more than
torment my readers. {Footnote}
Still,
inventing a new word is easy compared to instigating
public usage.
Better, it seems, to adopt a word that is already
accepted -- preferably
from English (like 'memory') or another language (like
'cache').
Consider the expression 'least recently
used.' It is not
the
same as 'oldest.' An item that is 'least recently
used' may be neither
'least useful' nor 'least valuable.' And 'obsolete'
misses the point altogether.
For expressing 'least recently used', I found nothing
that even came close. It's a
little bit like trying to find an English word for chutzpah.
Until a Japanese friend gave me mukashi.
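A few lines of Python demonstrate the distinction, with names invented for the purpose: the item stored first can be the one touched most recently, so 'oldest' and 'least recently used' part company.

    from collections import OrderedDict

    history = OrderedDict()
    history['alpha'] = 'stored first'    # oldest by insertion
    history['beta'] = 'stored second'
    history.move_to_end('alpha')         # 'alpha' just got used again
    print(next(iter(history)))           # -> 'beta': least recently used, not oldest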
But I was unable to stir up much interest
in the term before
moving on to other work. I shall never have an
opportunity to say something
like, "Performance of the cachecade memory depends
greatly on the details
of its mukashi algorithm."
SideBar
One of my small contributions to
cache technology, memorialized
in U.S. Patent No. 3,938,097, is a hierarchical
cache. Groaner Alert: I
called it 'cachecade.' Hey, at least it was an
attempt at inventive language.
{Return}
Footnote
Check What's
Not in a Name?
for lamentations about the language of The Software
Age. A related feature
article in S:TM led to a memoir Softword:
Provenance for the Word 'Software',
which gives the ironic history
of what is arguably the most important word coined
in a hundred years.
The sophisticated reader may want to review Software
Does Not Fail to become
disabused of language-intensive
cyber-myths. {Return}