mukashi Of or pertaining to an item, usually information, that has not been used recently. Distinguished from oldest, least useful, or least valuable. [From Japanese mukashi, an introductory expression similar to the English "Once upon a time..." and connoting "Here begins a story which has not been told for the longest time..."]
Ironic, isn't it. Everybody has a memory, but nobody can explain how it works. Memories in brains differ from computer memories. Most people finally understand that. Brains learn, for one thing. You don't load them, they learn. They forget, too, even when the power stays on. Information inside a human memory fades away over a period of time. Not all information, though. Everybody has -- well, memories. Fond or otherwise. Computers find information inside their memories mostly by addressing, sometimes by scanning, never by thinking. Brains work by association, we think. Units of information are coupled together by invisible, elastic bonds. There are dynamic processes at work, too. Circulating within a baffling neuronic structure are electrochemical impulses linking one idea to another -- by relevance or by common attribute or by whatever it is that nourishes familiarity. Brains recall a memory by consciously stimulating the content of the memory itself, or part of it. Stirring things up in there somehow. The recalling process reinforces the content, selectively preserving the memory.
Brains work fast. We can recognize a face quicker than any computer. That will probably be true until well into the 21st century. With all their 'megaflops,' computers cannot process visual information fast enough to catch a fly ball -- cannot even tell a left hand from a right hand at a glance. Computers have an awful time recognizing individual words embedded in continuous speech -- especially accented speech, heard over a noisy telephone line -- in realtime. Our central nervous systems operate at a speed sufficient to keep us out of most kinds of trouble. But, when it comes to capacity, human memories, like everything else, are limited. Along comes writing, which makes for extrasomatic storage of information. Reading puts it back into a brain for processing. That takes time, though. Sometimes too much.
Only persons who have been living in a cave since the middle of the Twentieth Century would not know that computers have various kinds of memories. To describe three: First there is ROM, for read-only memory. ROM contains information which was built into the machine at the factory. ROM provides the computer with primitive capabilities -- unchanging procedures for interpreting the keyboard, for shaping the characters on the screen, things like that. Human ROM, one might say, gave us two things...
Cache memory is a special form of high-speed memory. The first cache memories appeared in the early sixties. That was back when computers filled a room and were still called "giant brains." RAMs were slow and expensive in those days. A RAM in the sixties comprised gazillions of tiny Cheerio-shaped cores made out of iron strung together with hair-thin copper wires, forming blocks the size of bricks. Core memories were slow, and the bigger you made them the slower they operated. The cache memory -- also called a 'cache buffer' -- made 'large core storage' operate faster. Later cache technology was used to buffer disks, increasing their effective speed. The cache idea is based on an attribute of information itself, whether stored in chips and floppies or in brains and books. At any given instant, an item of information possesses a kind of 'vitality,' which is based on the likelihood that it will soon be needed either for computing or for thinking. Obviously, if an item in memory is about to be used, you want it to be available fast. Its access time is more significant than that of other items, which may not be needed for a long time.
Computers, as mentioned earlier, don't make references to memory in a 'random' order, despite the R in RAM. If they did, cache buffering would not work. Instead, computers tend to access information in blocks (items in a record) and repeatedly (loops). The cache exploits both of these tendencies.
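To put rough numbers on those tendencies, here is a minimal sketch in Python (my choice of notation here, not anything drawn from the machines above). For each memory reference it asks whether the same item was also touched within the last few references. The access patterns, the window size, and the function name are all invented for illustration.

```python
import random

def recent_reuse_fraction(accesses, window=16):
    """Fraction of accesses whose item was also touched in the previous `window` accesses."""
    reused = 0
    for i, item in enumerate(accesses):
        if item in accesses[max(0, i - window):i]:
            reused += 1
    return reused / len(accesses)

# A loop reading the same 8 fields of a record over and over:
looping = [f"record.field{n}" for n in range(8)] * 1000
print(recent_reuse_fraction(looping))    # close to 1.0 -- a cache would help a lot

# Scattered, truly 'random' references to 10,000 distinct addresses:
scattered = [random.randrange(10_000) for _ in range(8_000)]
print(recent_reuse_fraction(scattered))  # close to 0.0 -- caching buys almost nothing
```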
Something has to give. Information must be thrown out of the cache ('demoted') to make room for newly accessed strings of items. The mindless machine must be endowed with the ability to decide which items to demote. What is the most 'reasonable' (to use a brain-like term) basis for that decision?
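One time-honored answer -- the one the coinage at the top of this piece is named for -- is to demote the item that was least recently used: the mukashi item. Below is a minimal sketch of that policy in Python, keeping items ordered from least to most recently used with the standard library's OrderedDict. The class name, method names, and capacity are illustrative, not taken from any real cache.

```python
from collections import OrderedDict

class MukashiCache:
    """A minimal LRU cache sketch: when full, demote the least recently
    used -- the most 'mukashi' -- item."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()   # ordered from least to most recently used

    def get(self, key):
        if key not in self.items:
            return None                  # miss: caller must go to the slower memory
        self.items.move_to_end(key)      # touching an item makes it the most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # demote the item untouched for the longest time

cache = MukashiCache(capacity=3)
for k in ("a", "b", "c"):
    cache.put(k, k.upper())
cache.get("a")             # 'a' is fresh again; 'b' is now the mukashi item
cache.put("d", "D")        # cache is full, so 'b' gets demoted
print(list(cache.items))   # ['c', 'a', 'd']
```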
As applied in web browsers, the 'cache' improves the apparent performance of the Internet -- which is essentially a gigantic disk memory distributed all over the world and accessed by a communications protocol. The cache stores on your own hard drive a copy of each page and image as you acquire them from various websites. If you click "Back", you will get that same information not from the web but from your own memory -- a whole lot faster. The cache software is smart enough to recognize that what you are accessing is available locally. If you come back to cached pages or images before they get shoved out of your computer's memory, then you will enjoy the high performance that caches were invented to provide.
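For the curious, here is a minimal Python sketch of that idea: before fetching a page over the network, look for a copy already saved on the local disk. The cache directory, the hashing of URLs into file names, and the fetch function are illustrative assumptions of mine; real browser caches also honor expiry headers, revalidate with the server, and bound their total size.

```python
import hashlib
import os
import urllib.request

CACHE_DIR = "browser_cache"   # illustrative location; real browsers manage this themselves

def fetch(url):
    """Return the bytes for `url`, preferring a locally cached copy."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, hashlib.sha256(url.encode()).hexdigest())
    if os.path.exists(path):                   # hit: serve from your own disk, a whole lot faster
        with open(path, "rb") as f:
            return f.read()
    with urllib.request.urlopen(url) as resp:  # miss: go out over the network
        data = resp.read()
    with open(path, "wb") as f:                # keep a copy for the next visit
        f.write(data)
    return data
```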
Lack of creativity in computer language bothers me a whole lot, hence my habitual coining of neologisms, which seldom do more than torment my readers. {Footnote} Still, inventing a new word is easy compared to instigating public usage. Better, it seems, to adopt a word that is already accepted -- preferably from English (like 'memory') or another language (like 'cache'). Consider the expression 'least recently used.' It is not the same as 'oldest.' An item that is 'least recently used' may be neither 'least useful' nor 'least valuable.' And 'obsolete' misses the point altogether. For expressing 'least recently used', I found nothing that even came close. It's a little bit like trying to find an English word for chutzpah.