In the Beginning was the Command Line

by Neal Stephenson


About twenty years ago Jobs and Wozniak, the founders of Apple, came up with the very strange idea of selling information processing machines for use in the home. The business took off, and its founders made a lot of money and received the credit they deserved for being daring visionaries. But around the same time, Bill Gates and Paul Allen came up with an idea even stranger and more fantastical: selling computer operating systems. This was much weirder than the idea of Jobs and Wozniak. A computer at least had some sort of physical reality to it. It came in a box, you could open it up and plug it in and watch lights blink. An operating system had no tangible incarnation at all. It arrived on a disk, of course, but the disk was, in effect, nothing more than the box that the OS came in. The product itself was a very long string of ones and zeroes that, when properly installed and coddled, gave you the ability to manipulate other very long strings of ones and zeroes. Even those few who actually understood what a computer operating system was were apt to think of it as a fantastically arcane engineering prodigy, like a breeder reactor or a U-2 spy plane, and not something that could ever be (in the parlance of high-tech) "productized."

Yet now the company that Gates and Allen founded is selling operating systems like Gillette sells razor blades. New releases of operating systems are launched as if they were Hollywood blockbusters, with celebrity endorsements, talk show appearances, and world tours. The market for them is vast enough that people worry about whether it has been monopolized by one company. Even the least technically-minded people in our society now have at least a hazy idea of what operating systems do; what is more, they have strong opinions about their relative merits. It is commonly understood, even by technically unsophisticated computer users, that if you have a piece of software that works on your Macintosh, and you move it over onto a Windows machine, it will not run. That this would, in fact, be a laughable and idiotic mistake, like nailing horseshoes to the tires of a Buick.

A person who went into a coma before Microsoft was founded, and woke up now, could pick up this morning's New York Times and understand everything in it--almost:


Item: the richest man in the world made his fortune from--what? Railways? Shipping? Oil? No, operating systems. Item: the Department of Justice is tackling Microsoft's supposed OS monopoly with legal tools that were invented to restrain the power of Nineteenth-Century robber barons. Item: a woman friend of mine recently told me that she'd broken off a (hitherto) stimulating exchange of e-mail with a young man. At first he had seemed like such an intelligent and interesting guy, she said, but then "he started going all PC-versus-Mac on me."

What the hell is going on here? And does the operating system business have a future, or only a past? Here is my view, which is entirely subjective; but since I have spent a fair amount of time not only using, but programming, Macintoshes, Windows machines, Linux boxes and the BeOS, perhaps it is not so ill-informed as to be completely worthless. This is a subjective essay, more review than research paper, and so it might seem unfair or biased compared to the technical reviews you can find in PC magazines. But ever since the Mac came out, our operating systems have been based on metaphors, and anything with metaphors in it is fair game as far as I'm concerned.


MGBs, TANKS, AND BATMOBILES

Around the time that Jobs, Wozniak, Gates, and Allen were dreaming up these unlikely schemes, I was a teenager living in Ames, Iowa. One of my friends' dads had an old MGB sports car rusting away in his garage. Sometimes he would actually manage to get it running and then he would take us for a spin around the block, with a memorable look of wild youthful exhilaration on his face; to his worried passengers, he was a madman, stalling and backfiring around Ames, Iowa and eating the dust of rusty Gremlins and Pintos, but in his own mind he was Dustin Hoffman tooling across the Bay Bridge with the wind in his hair.

In retrospect, this was telling me two things about people's relationship to technology. One was that romance and image go a long way towards shaping their opinions. If you doubt it (and if you have a lot of spare time on your hands) just ask anyone who owns a Macintosh and who, on those grounds, imagines him- or herself to be a member of an oppressed minority group.

The other, somewhat subtler point, was that interface is very important. Sure, the MGB was a lousy car in almost every way that counted: balky, unreliable, underpowered. But it was fun to drive. It was responsive. Every pebble on the road was felt in the bones, every nuance in the pavement transmitted instantly to the driver's hands. He could listen to the engine and tell what was wrong with it. The steering responded immediately to commands from his hands. To us passengers it was a pointless exercise in going nowhere--about as interesting as peering over someone's shoulder while he punches numbers into a spreadsheet. But to the driver it was an experience. For a short time he was extending his body and his senses into a larger realm, and doing things that he couldn't do unassisted.

The analogy between cars and operating systems is not half bad, and so let me run with it for a moment, as a way of giving an executive summary of our situation today.

Imagine a crossroads where four competing auto dealerships are situated. 
One of them (Microsoft) is much, much bigger than the others. It started 
out years ago selling three-speed bicycles (MS-DOS); these were not 
perfect, but they worked, and when they broke you could easily fix them.

There was a competing bicycle dealership next door (Apple) that one day began selling motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.

The big dealership responded by rushing a moped upgrade kit (the original Windows) onto the market. This was a Rube Goldberg contraption that, when bolted onto a three-speed bicycle, enabled it to keep up, just barely, with Apple-cars. The users had to wear goggles and were always picking bugs out of their teeth while Apple owners sped along in hermetically sealed comfort, sneering out the windows. But the Micro-mopeds were cheap, and easy to fix compared with the Apple-cars, and their market share waxed.

Eventually the big dealership came out with a full-fledged car: a colossal station wagon (Windows 95). It had all the aesthetic appeal of a Soviet worker housing block, it leaked oil and blew gaskets, and it was an enormous success. A little later, they also came out with a hulking off-road vehicle intended for industrial users (Windows NT) which was no more beautiful than the station wagon, and only a little more reliable.

Since then there has been a lot of noise and shouting, but little has changed. The smaller dealership continues to sell sleek Euro-styled sedans and to spend a lot of money on advertising campaigns. They have had GOING OUT OF BUSINESS! signs taped up in their windows for so long that they have gotten all yellow and curly. The big one keeps making bigger and bigger station wagons and ORVs.

On the other side of the road are two competitors that have come along more recently.

One of them (Be, Inc.) is selling fully operational Batmobiles (the BeOS). They are more beautiful and stylish even than the Euro-sedans, better designed, more technologically advanced, and at least as reliable as anything else on the market--and yet cheaper than the others.

With one exception, that is: Linux, which is right next door, and which is not a business at all. It's a bunch of RVs, yurts, tepees, and geodesic domes set up in a field and organized by consensus. The people who live there are making tanks. These are not old-fashioned, cast-iron Soviet tanks; these are more like the M1 tanks of the U.S. Army, made of space-age materials and jammed with sophisticated technology from one end to the other. But they are better than Army tanks. They've been modified in such a way that they never, ever break down, are light and maneuverable enough to use on ordinary streets, and use no more fuel than a subcompact car. These tanks are being cranked out, on the spot, at a terrific pace, and a vast number of them are lined up along the edge of the road with keys in the ignition. Anyone who wants can simply climb into one and drive it away for free.

Customers come to this crossroads in throngs, day and night. Ninety percent of them go straight to the biggest dealership and buy station wagons or off-road vehicles. They do not even look at the other dealerships.

Of the remaining ten percent, most go and buy a sleek Euro-sedan, pausing only to turn up their noses at the philistines going to buy the station wagons and ORVs. If they even notice the people on the opposite side of the road, selling the cheaper, technically superior vehicles, these customers deride them as cranks and half-wits.

The Batmobile outlet sells a few vehicles to the occasional car nut who wants a second vehicle to go with his station wagon, but seems to accept, at least for now, that it's a fringe player.

The group giving away the free tanks only stays alive because it is staffed by volunteers, who are lined up at the edge of the street with bullhorns, trying to draw customers' attention to this incredible situation. A typical conversation goes something like this:

Hacker with bullhorn: "Save your money! Accept one of our free tanks! It is invulnerable, and can drive across rocks and swamps at ninety miles an hour while getting a hundred miles to the gallon!"

Prospective station wagon buyer: "I know what you say is true...but...er...I don't know how to maintain a tank!"

Bullhorn: "You don't know how to maintain a station wagon either!"

Buyer: "But this dealership has mechanics on staff. If something goes 
wrong with
 my station wagon, I can take a day off work, bring it here, and pay them 
to wor k on it while I sit in the waiting room for hours, listening to 
elevator music."

Bullhorn: "But if you accept one of our free tanks we will send volunteers 
to yo ur house to fix it for free while you sleep!"

Buyer: "Stay away from my house, you freak!"

Bullhorn: "But..."

Buyer: "Can't you see that everyone is buying station wagons?"


BIT-FLINGER


The connection between cars, and ways of interacting with computers, wouldn't have occurred to me at the time I was being taken for rides in that MGB. I had signed up to take a computer programming class at Ames High School. After a few introductory lectures, we students were granted admission into a tiny room containing a teletype, a telephone, and an old-fashioned modem consisting of a metal box with a pair of rubber cups on the top (note: many readers, making their way through that last sentence, probably felt an initial pang of dread that this essay was about to turn into a tedious, codgerly reminiscence about how tough we had it back in the old days; rest assured that I am actually positioning my pieces on the chessboard, as it were, in preparation to make a point about truly hip and up-to-the-minute topics like Open Source Software). The teletype was exactly the same sort of machine that had been used, for decades, to send and receive telegrams. It was basically a loud typewriter that could only produce UPPERCASE LETTERS. Mounted to one side of it was a smaller machine with a long reel of paper tape on it, and a clear plastic hopper underneath.

In order to connect this device (which was not a computer at all) to the Iowa State University mainframe across town, you would pick up the phone, dial the computer's number, listen for strange noises, and then slam the handset down into the rubber cups. If your aim was true, one would wrap its neoprene lips around the earpiece and the other around the mouthpiece, consummating a kind of informational soixante-neuf. The teletype would shudder as it was possessed by the spirit of the distant mainframe, and begin to hammer out cryptic messages.

Since computer time was a scarce resource, we used a sort of batch processing technique. Before dialing the phone, we would turn on the tape puncher (a subsidiary machine bolted to the side of the teletype) and type in our programs. Each time we depressed a key, the teletype would bash out a letter on the paper in front of us, so we could read what we'd typed; but at the same time it would convert the letter into a set of eight binary digits, or bits, and punch a corresponding pattern of holes across the width of a paper tape. The tiny disks of paper knocked out of the tape would flutter down into the clear plastic hopper, which would slowly fill up with what can only be described as actual bits. On the last day of the school year, the smartest kid in the class (not me) jumped out from behind his desk and flung several quarts of these bits over the head of our teacher, like confetti, as a sort of semi-affectionate practical joke. The image of this man sitting there, gripped in the opening stages of an atavistic fight-or-flight reaction, with millions of bits (megabytes) sifting down out of his hair and into his nostrils and mouth, his face gradually turning purple as he built up to an explosion, is the single most memorable scene from my formal education.
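
For the curious, here is a minimal sketch, in Python, of the conversion the tape puncher was performing: each typed character becomes eight binary digits and a corresponding row of punched and unpunched positions across the tape. (ASCII is assumed purely for illustration; the real teletype used its own character code.)

    # A rough sketch of what the tape puncher was doing: turn each typed
    # character into a row of hole/no-hole positions across the paper tape.
    # (ASCII is assumed here; the real machine used its own code.)

    def punch(text):
        for ch in text:
            bits = format(ord(ch), "08b")          # eight binary digits per character
            row = "".join("o" if b == "1" else "." for b in bits)
            print(f"{ch!r:>5}  {bits}  {row}")     # 'o' = hole punched, '.' = paper left intact

    punch("RUN")
    # 'R'  01010010  .o.o..o.
    # 'U'  01010101  .o.o.o.o
    # 'N'  01001110  .o..ooo.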

Anyway, it will have been obvious that my interaction with the computer was of an extremely formal nature, being sharply divided up into different phases, viz.: (1) sitting at home with paper and pencil, miles and miles from any computer, I would think very, very hard about what I wanted the computer to do, and translate my intentions into a computer language--a series of alphanumeric symbols on a page. (2) I would carry this across a sort of informational cordon sanitaire (three miles of snowdrifts) to school and type those letters into a machine--not a computer--which would convert the symbols into binary numbers and record them visibly on a tape. (3) Then, through the rubber-cup modem, I would cause those numbers to be sent to the university mainframe, which would (4) do arithmetic on them and send different numbers back to the teletype. (5) The teletype would convert these numbers back into letters and hammer them out on a page and (6) I, watching, would construe the letters as meaningful symbols.

The division of responsibilities implied by all of this is admirably clean: computers do arithmetic on bits of information. Humans construe the bits as meaningful symbols. But this distinction is now being blurred, or at least complicated, by the advent of modern operating systems that use, and frequently abuse, the power of metaphor to make computers accessible to a larger audience. Along the way--possibly because of those metaphors, which make an operating system a sort of work of art--people start to get emotional, and grow attached to pieces of software in the way that my friend's dad did to his MGB.

People who have only interacted with computers through graphical user interfaces like the MacOS or Windows--which is to say, almost everyone who has ever used a computer--may have been startled, or at least bemused, to hear about the telegraph machine that I used to communicate with a computer in 1973. But there was, and is, a good reason for using this particular kind of technology. Human beings have various ways of communicating to each other, such as music, art, dance, and facial expressions, but some of these are more amenable than others to being expressed as strings of symbols. Written language is the easiest of all, because, of course, it consists of strings of symbols to begin with. If the symbols happen to belong to a phonetic alphabet (as opposed to, say, ideograms), converting them into bits is a trivial procedure, and one that was nailed, technologically, in the early nineteenth century, with the introduction of Morse code and other forms of telegraphy.

We had a human/computer interface a hundred years before we had computers. When computers came into being around the time of the Second World War, humans, quite naturally, communicated with them by simply grafting them on to the already-existing technologies for translating letters into bits and vice versa: teletypes and punch card machines.

These embodied two fundamentally different approaches to computing. When you were using cards, you'd punch a whole stack of them and run them through the reader all at once, which was called batch processing. You could also do batch processing with a teletype, as I have already described, by using the paper tape reader, and we were certainly encouraged to use this approach when I was in high school. But--though efforts were made to keep us unaware of this--the teletype could do something that the card reader could not. On the teletype, once the modem link was established, you could just type in a line and hit the return key. The teletype would send that line to the computer, which might or might not respond with some lines of its own, which the teletype would hammer out--producing, over time, a transcript of your exchange with the machine. This way of doing it did not even have a name at the time, but when, much later, an alternative became available, it was retroactively dubbed the Command Line Interface.

When I moved on to college, I did my computing in large, stifling rooms where scores of students would sit in front of slightly updated versions of the same machines and write computer programs: these used dot-matrix printing mechanisms, but were (from the computer's point of view) identical to the old teletypes. By that point, computers were better at time-sharing--that is, mainframes were still mainframes, but they were better at communicating with a large number of terminals at once. Consequently, it was no longer necessary to use batch processing. Card readers were shoved out into hallways and boiler rooms, and batch processing became a nerds-only kind of thing, and consequently took on a certain eldritch flavor among those of us who even knew it existed. We were all off the Batch, and on the Command Line, interface now--my very first shift in operating system paradigms, if only I'd known it.

A huge stack of accordion-fold paper sat on the floor underneath each one of these glorified teletypes, and miles of paper shuddered through their platens. Almost all of this paper was thrown away or recycled without ever having been touched by ink--an ecological atrocity so glaring that those machines were soon replaced by video terminals--so-called "glass teletypes"--which were quieter and didn't waste paper. Again, though, from the computer's point of view these were indistinguishable from World War II-era teletype machines. In effect we still used Victorian technology to communicate with computers until about 1984, when the Macintosh was introduced with its Graphical User Interface. Even after that, the Command Line continued to exist as an underlying stratum--a sort of brainstem reflex--of many modern computer systems all through the heyday of Graphical User Interfaces, or GUIs as I will call them from now on.


GUIs


Now the first job that any coder needs to do when writing a new piece of software is to figure out how to take the information that is being worked with (in a graphics program, an image; in a spreadsheet, a grid of numbers) and turn it into a linear string of bytes. These strings of bytes are commonly called files or (somewhat more hiply) streams. They are to telegrams what modern humans are to Cro-Magnon man, which is to say the same thing under a different name. All that you see on your computer screen--your Tomb Raider, your digitized voice mail messages, faxes, and word processing documents written in thirty-seven different typefaces--is still, from the computer's point of view, just like telegrams, except much longer, and demanding of more arithmetic.

The quickest way to get a taste of this is to fire up your web browser, visit a site, and then select the View/Document Source menu item. You will get a bunch of computer code that looks something like this:

<HTML> <HEAD>
        <TITLE> C R Y P T O N O M I C O N</TITLE>

</HEAD> 

<BODY BGCOLOR="#000000" LINK="#996600" ALINK="#FFFFFF" 
VLINK="#663300">

<MAP NAME="navtext">
        <AREA SHAPE=RECT HREF="praise.html" COORDS="0,37,84,55">
        <AREA SHAPE=RECT HREF="author.html" COORDS="0,59,137,75">
        <AREA SHAPE=RECT HREF="text.html" COORDS="0,81,101,96">
        <AREA SHAPE=RECT HREF="tour.html" COORDS="0,100,121,117">
        <AREA SHAPE=RECT HREF="order.html" COORDS="0,122,143,138">
        <AREA SHAPE=RECT HREF="beginning.html" COORDS="0,140,213,157">
</MAP>


<CENTER>
<TABLE BORDER="0" CELLPADDING="0" CELLSPACING="0" WIDTH="520">
<TR>

        <TD VALIGN=TOP ROWSPAN="5">
        <IMG SRC="images/spacer.gif" WIDTH="30" HEIGHT="1" BORDER="0">
        </TD>

        <TD VALIGN=TOP COLSPAN="2">
        <IMG SRC="images/main_banner.gif" ALT="Cryptonomincon by Neal
Stephenson" WIDTH="479" HEIGHT="122" BORDER="0">
        </TD>

</TR>

This crud is called HTML (HyperText Markup Language) and it is basically a very simple programming language instructing your web browser how to draw a page on a screen. Anyone can learn HTML and many people do. The important thing is that no matter what splendid multimedia web pages they might represent, HTML files are just telegrams.

When Ronald Reagan was a radio announcer, he used to call baseball games by reading the terse descriptions that trickled in over the telegraph wire and were printed out on a paper tape. He would sit there, all by himself in a padded room with a microphone, and the paper tape would eke out of the machine and crawl over the palm of his hand printed with cryptic abbreviations. If the count went to three and two, Reagan would describe the scene as he saw it in his mind's eye: "The brawny left-hander steps out of the batter's box to wipe the sweat from his brow. The umpire steps forward to sweep the dirt from home plate." and so on. When the cryptogram on the paper tape announced a base hit, he would whack the edge of the table with a pencil, creating a little sound effect, and describe the arc of the ball as if he could actually see it. His listeners, many of whom presumably thought that Reagan was actually at the ballpark watching the game, would reconstruct the scene in their minds according to his descriptions.

This is exactly how the World Wide Web works: the HTML files are the pithy description on the paper tape, and your Web browser is Ronald Reagan. The same is true of Graphical User Interfaces in general.

So an OS is a stack of metaphors and abstractions that stands between you and the telegrams, embodying various tricks the programmer used to convert the information you're working with--be it images, e-mail messages, movies, or word processing documents--into the necklaces of bytes that are the only things computers know how to work with. When we used actual telegraph equipment (teletypes) or their higher-tech substitutes ("glass teletypes," or the MS-DOS command line) to work with our computers, we were very close to the bottom of that stack. When we use most modern operating systems, though, our interaction with the machine is heavily mediated. Everything we do is interpreted and translated time and again as it works its way down through all of the metaphors and abstractions.

The Macintosh OS was a revolution in both the good and bad senses of that word. Obviously it was true that command line interfaces were not for everyone, and that it would be a good thing to make computers more accessible to a less technical audience--if not for altruistic reasons, then because those sorts of people constituted an incomparably vaster market. It was clear that the Mac's engineers saw a whole new country stretching out before them; you could almost hear them muttering, "Wow! We don't have to be bound by files as linear streams of bytes anymore, vive la revolution, let's see how far we can take this!" No command line interface was available on the Macintosh; you talked to it with the mouse, or not at all. This was a statement of sorts, a credential of revolutionary purity. It seemed that the designers of the Mac intended to sweep Command Line Interfaces into the dustbin of history.

My own personal love affair with the Macintosh began in the spring of 1984 in a computer store in Cedar Rapids, Iowa, when a friend of mine--coincidentally, the son of the MGB owner--showed me a Macintosh running MacPaint, the revolutionary drawing program. It ended in July of 1995 when I tried to save a big important file on my Macintosh Powerbook and instead of doing so, it annihilated the data so thoroughly that two different disk crash utility programs were unable to find any trace that it had ever existed. During the intervening ten years, I had a passion for the MacOS that seemed righteous and reasonable at the time but in retrospect strikes me as being exactly the same sort of goofy infatuation that my friend's dad had with his car.

The introduction of the Mac triggered a sort of holy war in the computer world. Were GUIs a brilliant design innovation that made computers more human-centered and therefore accessible to the masses, leading us toward an unprecedented revolution in human society, or an insulting bit of audiovisual gimcrackery dreamed up by flaky Bay Area hacker types that stripped computers of their power and flexibility and turned the noble and serious work of computing into a childish video game?

This debate actually seems more interesting to me today than it did in the mid-1980s. But people more or less stopped debating it when Microsoft endorsed the idea of GUIs by coming out with the first Windows. At this point, command-line partisans were relegated to the status of silly old grouches, and a new conflict was touched off, between users of MacOS and users of Windows.

There was plenty to argue about. The first Macintoshes looked different from other PCs even when they were turned off: they consisted of one box containing both CPU (the part of the computer that does arithmetic on bits) and monitor screen. This was billed, at the time, as a philosophical statement of sorts: Apple wanted to make the personal computer into an appliance, like a toaster. But it also reflected the purely technical demands of running a graphical user interface. In a GUI machine, the chips that draw things on the screen have to be integrated with the computer's central processing unit, or CPU, to a far greater extent than is the case with command-line interfaces, which until recently didn't even know that they weren't just talking to teletypes.

This distinction was of a technical and abstract nature, but it became clearer when the machine crashed (it is commonly the case with technologies that you can get the best insight about how they work by watching them fail). When everything went to hell and the CPU began spewing out random bits, the result, on a CLI machine, was lines and lines of perfectly formed but random characters on the screen--known to cognoscenti as "going Cyrillic." But to the MacOS, the screen was not a teletype, but a place to put graphics; the image on the screen was a bitmap, a literal rendering of the contents of a particular portion of the computer's memory. When the computer crashed and wrote gibberish into the bitmap, the result was something that looked vaguely like static on a broken television set--a "snow crash."

And even after the introduction of Windows, the underlying differences endured; when a Windows machine got into trouble, the old command-line interface would fall down over the GUI like an asbestos fire curtain sealing off the proscenium of a burning opera. When a Macintosh got into trouble it presented you with a cartoon of a bomb, which was funny the first time you saw it.

And these were by no means superficial differences. The reversion of Windows to a CLI when it was in distress proved to Mac partisans that Windows was nothing more than a cheap facade, like a garish afghan flung over a rotted-out sofa. They were disturbed and annoyed by the sense that lurking underneath Windows' ostensibly user-friendly interface was--literally--a subtext.

For their part, Windows fans might have made the sour observation that all computers, even Macintoshes, were built on that same subtext, and that the refusal of Mac owners to admit that fact to themselves seemed to signal a willingness, almost an eagerness, to be duped.

Anyway, a Macintosh had to switch individual bits in the memory chips on the video card, and it had to do it very fast, and in arbitrarily complicated patterns. Nowadays this is cheap and easy, but in the technological regime that prevailed in the early 1980s, the only realistic way to do it was to build the motherboard (which contained the CPU) and the video system (which contained the memory that was mapped onto the screen) as a tightly integrated whole--hence the single, hermetically sealed case that made the Macintosh so distinctive.

When Windows came out, it was conspicuous for its ugliness, and its current successors, Windows 95 and Windows NT, are not things that people would pay money to look at either. Microsoft's complete disregard for aesthetics gave all of us Mac-lovers plenty of opportunities to look down our noses at them. That Windows looked an awful lot like a direct ripoff of MacOS gave us a burning sense of moral outrage to go with it. Among people who really knew and appreciated computers (hackers, in Steven Levy's non-pejorative sense of that word) and in a few other niches such as professional musicians, graphic artists and schoolteachers, the Macintosh, for a while, was simply the computer. It was seen as not only a superb piece of engineering, but an embodiment of certain ideals about the use of technology to benefit mankind, while Windows was seen as a pathetically clumsy imitation and a sinister world domination plot rolled into one. So very early, a pattern had been established that endures to this day: people dislike Microsoft, which is okay; but they dislike it for reasons that are poorly considered, and in the end, self-defeating.


CLASS STRUGGLE ON THE DESKTOP

Now that the Third Rail has been firmly grasped, it is worth reviewing some basic facts here: like any other publicly traded, for-profit corporation, Microsoft has, in effect, borrowed a bunch of money from some people (its stockholders) in order to be in the bit business. As an officer of that corporation, Bill Gates has one responsibility only, which is to maximize return on investment. He has done this incredibly well. Any actions taken in the world by Microsoft--any software released by them, for example--are basically epiphenomena, which can't be interpreted or understood except insofar as they reflect Bill Gates's execution of his one and only responsibility.

It follows that if Microsoft sells goods that are aesthetically unappealing, or that don't work very well, it does not mean that they are (respectively) philistines or half-wits. It is because Microsoft's excellent management has figured out that they can make more money for their stockholders by releasing stuff with obvious, known imperfections than they can by making it beautiful or bug-free. This is annoying, but (in the end) not half so annoying as watching Apple inscrutably and relentlessly destroy itself.

Hostility towards Microsoft is not difficult to find on the Net, and it blends two strains: resentful people who feel Microsoft is too powerful, and disdainful people who think it's tacky. This is all strongly reminiscent of the heyday of Communism and Socialism, when the bourgeoisie were hated from both ends: by the proles, because they had all the money, and by the intelligentsia, because of their tendency to spend it on lawn ornaments. Microsoft is the very embodiment of modern high-tech prosperity--it is, in a word, bourgeois--and so it attracts all of the same gripes.

The opening "splash screen" for Microsoft Word 6.0 summed it up pretty neatly: when you started up the program you were treated to a picture of an expensive enamel pen lying across a couple of sheets of fancy-looking handmade writing paper. It was obviously a bid to make the software look classy, and it might have worked for some, but it failed for me, because the pen was a ballpoint, and I'm a fountain pen man. If Apple had done it, they would've used a Mont Blanc fountain pen, or maybe a Chinese calligraphy brush. And I doubt that this was an accident. Recently I spent a while re-installing Windows NT on one of my home computers, and many times had to double-click on the "Control Panel" icon. For reasons that are difficult to fathom, this icon consists of a picture of a clawhammer and a chisel or screwdriver resting on top of a file folder.

These aesthetic gaffes give one an almost uncontrollable urge to make fun of Microsoft, but again, it is all beside the point--if Microsoft had done focus group testing of possible alternative graphics, they probably would have found that the average mid-level office worker associated fountain pens with effete upper management toffs and was more comfortable with ballpoints. Likewise, the regular guys, the balding dads of the world who probably bear the brunt of setting up and maintaining home computers, can probably relate better to a picture of a clawhammer--while perhaps harboring fantasies of taking a real one to their balky computers.

This is the only way I can explain certain peculiar facts about the current market for operating systems, such as that ninety percent of all customers continue to buy station wagons off the Microsoft lot while free tanks are there for the taking, right across the street.

A string of ones and zeroes was not a difficult thing for Bill Gates to distribute, once he'd thought of the idea. The hard part was selling it--reassuring customers that they were actually getting something in return for their money.

Anyone who has ever bought a piece of software in a store has had the curiously deflating experience of taking the bright shrink-wrapped box home, tearing it open, finding that it's 95 percent air, throwing away all the little cards, party favors, and bits of trash, and loading the disk into the computer. The end result (after you've lost the disk) is nothing except some images on a computer screen, and some capabilities that weren't there before. Sometimes you don't even have that--you have a string of error messages instead. But your money is definitely gone. Now we are almost accustomed to this, but twenty years ago it was a very dicey business proposition. Bill Gates made it work anyway. He didn't make it work by selling the best software or offering the cheapest price. Instead he somehow got people to believe that they were receiving something in exchange for their money.

The streets of every city in the world are filled with those hulking, rattling station wagons. Anyone who doesn't own one feels a little weird, and wonders, in spite of himself, whether it might not be time to cease resistance and buy one; anyone who does, feels confident that he has acquired some meaningful possession, even on those days when the vehicle is up on a lift in an auto repair shop.

All of this is perfectly congruent with membership in the bourgeoisie, which is as much a mental, as a material state. And it explains why Microsoft is regularly attacked, on the Net, from both sides. People who are inclined to feel poor and oppressed construe everything Microsoft does as some sinister Orwellian plot. People who like to think of themselves as intelligent and informed technology users are driven crazy by the clunkiness of Windows.

Nothing is more annoying to sophisticated people than to see someone who is rich enough to know better being tacky--unless it is to realize, a moment later, that they probably know they are tacky and they simply don't care and they are going to go on being tacky, and rich, and happy, forever. Microsoft therefore bears the same relationship to the Silicon Valley elite as the Beverly Hillbillies did to their fussy banker, Mr. Drysdale--who is irritated not so much by the fact that the Clampetts moved to his neighborhood as by the knowledge that, when Jethro is seventy years old, he's still going to be talking like a hillbilly and wearing bib overalls, and he's still going to be a lot richer than Mr. Drysdale.

Even the hardware that Windows ran on, when compared to the machines put out by Apple, looked like white-trash stuff, and still mostly does. The reason was that Apple was and is a hardware company, while Microsoft was and is a software company. Apple therefore had a monopoly on hardware that could run MacOS, whereas Windows-compatible hardware came out of a free market. The free market seems to have decided that people will not pay for cool-looking computers; PC hardware makers who hire designers to make their stuff look distinctive get their clocks cleaned by Taiwanese clone makers punching out boxes that look as if they belong on cinderblocks in front of someone's trailer. But Apple could make their hardware as pretty as they wanted to and simply pass the higher prices on to their besotted consumers, like me. Only last week (I am writing this sentence in early Jan. 1999) the technology sections of all the newspapers were filled with adulatory press coverage of how Apple had released the iMac in several happenin' new colors like Blueberry and Tangerine.

Apple has always insisted on having a hardware monopoly, except for a brief period in the mid-1990s when they allowed clone-makers to compete with them, before subsequently putting them out of business. Macintosh hardware was, consequently, expensive. You didn't open it up and fool around with it because doing so would void the warranty. In fact the first Mac was specifically designed to be difficult to open--you needed a kit of exotic tools, which you could buy through little ads that began to appear in the back pages of magazines a few months after the Mac came out on the market. These ads always had a certain disreputable air about them, like pitches for lock-picking tools in the backs of lurid detective magazines.

This monopolistic policy can be explained in at least three different 
ways.

THE CHARITABLE EXPLANATION is that the hardware monopoly policy reflected a drive on Apple's part to provide a seamless, unified blending of hardware, operating system, and software. There is something to this. It is hard enough to make an OS that works well on one specific piece of hardware, designed and tested by engineers who work down the hallway from you, in the same company. Making an OS to work on arbitrary pieces of hardware, cranked out by rabidly entrepreneurial clonemakers on the other side of the International Date Line, is very difficult, and accounts for much of the troubles people have using Windows.

THE FINANCIAL EXPLANATION is that Apple, unlike Microsoft, is and always has been a hardware company. It simply depends on revenue from selling hardware, and cannot exist without it.

THE NOT-SO-CHARITABLE EXPLANATION has to do with Apple's corporate culture, which is rooted in Bay Area Baby Boomdom.

Now, since I'm going to talk for a moment about culture, full disclosure is probably in order, to protect myself against allegations of conflict of interest and ethical turpitude: (1) Geographically I am a Seattleite, of a Saturnine temperament, and inclined to take a sour view of the Dionysian Bay Area, just as they tend to be annoyed and appalled by us. (2) Chronologically I am a post-Baby Boomer. I feel that way, at least, because I never experienced the fun and exciting parts of the whole Boomer scene--just spent a lot of time dutifully chuckling at Boomers' maddeningly pointless anecdotes about just how stoned they got on various occasions, and politely fielding their assertions about how great their music was. But even from this remove it was possible to glean certain patterns, and one that recurred as regularly as an urban legend was the one about how someone would move into a commune populated by sandal-wearing, peace-sign flashing flower children, and eventually discover that, underneath this facade, the guys who ran it were actually control freaks; and that, as living in a commune, where much lip service was paid to ideals of peace, love and harmony, had deprived them of normal, socially approved outlets for their control-freakdom, it tended to come out in other, invariably more sinister, ways.

Applying this to the case of Apple Computer will be left as an exercise 
for the reader, and not a very difficult exercise.

It is a bit unsettling, at first, to think of Apple as a control freak, because it is completely at odds with their corporate image. Weren't these the guys who aired the famous Super Bowl ads showing suited, blindfolded executives marching like lemmings off a cliff? Isn't this the company that even now runs ads picturing the Dalai Lama (except in Hong Kong) and Einstein and other offbeat rebels?

It is indeed the same company, and the fact that they have been able to plant this image of themselves as creative and rebellious free-thinkers in the minds of so many intelligent and media-hardened skeptics really gives one pause. It is testimony to the insidious power of expensive slick ad campaigns and, perhaps, to a certain amount of wishful thinking in the minds of people who fall for them. It also raises the question of why Microsoft is so bad at PR, when the history of Apple demonstrates that, by writing large checks to good ad agencies, you can plant a corporate image in the minds of intelligent people that is completely at odds with reality. (The answer, for people who don't like Damoclean questions, is that since Microsoft has won the hearts and minds of the silent majority--the bourgeoisie--they don't give a damn about having a slick image, any more than Dick Nixon did. "I want to believe,"--the mantra that Fox Mulder has pinned to his office wall in The X-Files--applies in different ways to these two companies; Mac partisans want to believe in the image of Apple purveyed in those ads, and in the notion that Macs are somehow fundamentally different from other computers, while Windows people want to believe that they are getting something for their money, engaging in a respectable business transaction).

In any event, as of 1987, both MacOS and Windows were out on the market, running on hardware platforms that were radically different from each other--not only in the sense that MacOS used Motorola CPU chips while Windows used Intel, but in the sense--then overlooked, but in the long run, vastly more significant--that the Apple hardware business was a rigid monopoly and the Windows side was a churning free-for-all.

But the full ramifications of this did not become clear until very recently--in fact, they are still unfolding, in remarkably strange ways, as I'll explain when we get to Linux. The upshot is that millions of people got accustomed to using GUIs in one form or another. By doing so, they made Apple/Microsoft a lot of money. The fortunes of many people have become bound up with the ability of these companies to continue selling products whose salability is very much open to question.


HONEY-POT, TAR-PIT, WHATEVER


When Gates and Allen invented the idea of selling software, they ran into criticism from both hackers and sober-sided businesspeople. Hackers understood that software was just information, and objected to the idea of selling it. These objections were partly moral. The hackers were coming out of the scientific and academic world where it is imperative to make the results of one's work freely available to the public. They were also partly practical; how can you sell something that can be easily copied? Businesspeople, who are polar opposites of hackers in so many ways, had objections of their own. Accustomed to selling toasters and insurance policies, they naturally had a difficult time understanding how a long collection of ones and zeroes could constitute a salable product.

Obviously Microsoft prevailed over these objections, and so did Apple. But the objections still exist. The most hackerish of all the hackers, the Ur-hacker as it were, was and is Richard Stallman, who became so annoyed with the evil practice of selling software that, in 1984 (the same year that the Macintosh went on sale) he went off and founded something called the Free Software Foundation, which commenced work on something called GNU. Gnu is an acronym for Gnu's Not Unix, but this is a joke in more ways than one, because GNU most certainly IS Unix. Because of trademark concerns ("Unix" is trademarked by AT&T) they simply could not claim that it was Unix, and so, just to be extra safe, they claimed that it wasn't. Notwithstanding the incomparable talent and drive possessed by Mr. Stallman and other GNU adherents, their project to build a free Unix to compete against Microsoft and Apple's OSes was a little bit like trying to dig a subway system with a teaspoon. Until, that is, the advent of Linux, which I will get to later.

But the basic idea of re-creating an operating system from scratch was perfectly sound and completely doable. It has been done many times. It is inherent in the very nature of operating systems.

Operating systems are not strictly necessary. There is no reason why a sufficiently dedicated coder could not start from nothing with every project and write fresh code to handle such basic, low-level operations as controlling the read/write heads on the disk drives and lighting up pixels on the screen. The very first computers had to be programmed in this way. But since nearly every program needs to carry out those same basic operations, this approach would lead to vast duplication of effort.

Nothing is more disagreeable to the hacker than duplication of effort. The first and most important mental habit that people develop when they learn how to write computer programs is to generalize, generalize, generalize. To make their code as modular and flexible as possible, breaking large problems down into small subroutines that can be used over and over again in different contexts. Consequently, the development of operating systems, despite being technically unnecessary, was inevitable. Because at its heart, an operating system is nothing more than a library containing the most commonly used code, written once (and hopefully written well) and then made available to every coder who needs it.
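
To make the library point concrete, here is a small sketch in Python; nothing in it steers disk heads or lights pixels directly, because every line leans on subroutines the operating system already provides, written once and shared with every program. (The file name is just an example.)

    # Everything here is borrowed from the OS's library of commonly used
    # code: opening a file, writing bytes to it, reading them back.  No
    # program has to know how to move the read/write heads itself.
    import os

    fd = os.open("example.txt", os.O_CREAT | os.O_WRONLY | os.O_TRUNC)   # OS: give me a file
    os.write(fd, b"a very long string of ones and zeroes\n")             # OS: put these bytes on disk
    os.close(fd)

    fd = os.open("example.txt", os.O_RDONLY)
    print(os.read(fd, 1024).decode())                                    # OS: read them back
    os.close(fd)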

So a proprietary, closed, secret operating system is a contradiction in terms. It goes against the whole point of having an operating system. And it is impossible to keep them secret anyway. The source code--the original lines of text written by the programmers--can be kept secret. But an OS as a whole is a collection of small subroutines that do very specific, very clearly defined jobs. Exactly what those subroutines do has to be made public, quite explicitly and exactly, or else the OS is completely useless to programmers; they can't make use of those subroutines if they don't have a complete and perfect understanding of what the subroutines do.

The only thing that isn't made public is exactly how the subroutines do what they do. But once you know what a subroutine does, it's generally quite easy (if you are a hacker) to write one of your own that does exactly the same thing. It might take a while, and it is tedious and unrewarding, but in most cases it's not really hard.

What's hard, in hacking as in fiction, is not writing; it's deciding what to write. And the vendors of commercial OSes have already decided, and published their decisions.

This has been generally understood for a long time. MS-DOS was duplicated, functionally, by a rival product, written from scratch, called ProDOS, that did all of the same things in pretty much the same way. In other words, another company was able to write code that did all of the same things as MS-DOS and sell it at a profit. If you are using the Linux OS, you can get a free program called WINE which is a windows emulator; that is, you can open up a window on your desktop that runs windows programs. It means that a completely functional Windows OS has been recreated inside of Unix, like a ship in a bottle. And Unix itself, which is vastly more sophisticated than MS-DOS, has been built up from scratch many times over. Versions of it are sold by Sun, Hewlett-Packard, AT&T, Silicon Graphics, IBM, and others.

People have, in other words, been re-writing basic OS code for so long that all of the technology that constituted an "operating system" in the traditional (pre-GUI) sense of that phrase is now so cheap and common that it's literally free. Not only could Gates and Allen not sell MS-DOS today, they could not even give it away, because much more powerful OSes are already being given away. Even the original Windows (which was the only Windows until 1995) has become worthless, in that there is no point in owning something that can be emulated inside of Linux--which is, itself, free.

In this way the OS business is very different from, say, the car business. Even an old rundown car has some value. You can use it for making runs to the dump, or strip it for parts. It is the fate of manufactured goods to slowly and gently depreciate as they get old and have to compete against more modern products.

But it is the fate of operating systems to become free.

Microsoft is a great software applications company. Applications--such as Microsoft Word--are an area where innovation brings real, direct, tangible benefits to users. The innovations might be new technology straight from the research department, or they might be in the category of bells and whistles, but in any event they are frequently useful and they seem to make users happy. And Microsoft is in the process of becoming a great research company. But Microsoft is not such a great operating systems company. And this is not necessarily because their operating systems are all that bad from a purely technological standpoint. Microsoft's OSes do have their problems, sure, but they are vastly better than they used to be, and they are adequate for most people.

Why, then, do I say that Microsoft is not such a great operating systems company? Because the very nature of operating systems is such that it is senseless for them to be developed and owned by a specific company. It's a thankless job to begin with. Applications create possibilities for millions of credulous users, whereas OSes impose limitations on thousands of grumpy coders, and so OS-makers will forever be on the shit-list of anyone who counts for anything in the high-tech world. Applications get used by people whose big problem is understanding all of their features, whereas OSes get hacked by coders who are annoyed by their limitations. The OS business has been good to Microsoft only insofar as it has given them the money they needed to launch a really good applications software business and to hire a lot of smart researchers. Now it really ought to be jettisoned, like a spent booster stage from a rocket. The big question is whether Microsoft is capable of doing this. Or is it addicted to OS sales in the same way as Apple is to selling hardware?

Keep in mind that Apple's ability to monopolize its own hardware supply was once cited, by learned observers, as a great advantage over Microsoft. At the time, it seemed to place them in a much stronger position. In the end, it nearly killed them, and may kill them yet. The problem, for Apple, was that most of the world's computer users ended up owning cheaper hardware. But cheap hardware couldn't run MacOS, and so these people switched to Windows.

Replace "hardware" with "operating systems," and "Apple" with "Microsoft" 
and yo u can see the same thing about to happen all over again. Microsoft 
dominates the
 OS market, which makes them money and seems like a great idea for now. 
But chea per and better OSes are available, and they are growingly popular 
in parts of th e world that are not so saturated with computers as the US. 
Ten years from now, most of the world's computer users may end up owning 
these cheaper OSes. But the se OSes do not, for the time being, run any 
Microsoft applications, and so these
 people will use something else.

To put it more directly: every time someone decides to use a non-Microsoft OS, Microsoft's OS division, obviously, loses a customer. But, as things stand now, Microsoft's applications division loses a customer too. This is not such a big deal as long as almost everyone uses Microsoft OSes. But as soon as Windows' market share begins to slip, the math starts to look pretty dismal for the people in Redmond.

This argument could be countered by saying that Microsoft could simply re-compile its applications to run under other OSes. But this strategy goes against most normal corporate instincts. Again the case of Apple is instructive. When things started to go south for Apple, they should have ported their OS to cheap PC hardware. But they didn't. Instead, they tried to make the most of their brilliant hardware, adding new features and expanding the product line. But this only had the effect of making their OS more dependent on these special hardware features, which made it worse for them in the end.

Likewise, when Microsoft's position in the OS world is threatened, their corporate instincts will tell them to pile more new features into their operating systems, and then re-jigger their software applications to exploit those special features. But this will only have the effect of making their applications dependent on an OS with declining market share, and make it worse for them in the end.

The operating system market is a death-trap, a tar-pit, a slough of despond. There are only two reasons to invest in Apple and Microsoft. (1) each of these companies is in what we would call a co-dependency relationship with their customers. The customers Want To Believe, and Apple and Microsoft know how to give them what they want. (2) each company works very hard to add new features to their OSes, which works to secure customer loyalty, at least for a little while.

Accordingly, most of the remainder of this essay will be about those two 
topics.

THE TECHNOSPHERE

Unix is the only OS remaining whose GUI (a vast suite of code called the X Window System) is separate from the OS in the old sense of the phrase. This is to say that you can run Unix in pure command-line mode if you want to, with no windows, icons, mouses, etc. whatsoever, and it will still be Unix and capable of doing everything Unix is supposed to do. But the other OSes: MacOS, the Windows family, and BeOS, have their GUIs tangled up with the old-fashioned OS functions to the extent that they have to run in GUI mode, or else they are not really running. So it's no longer really possible to think of GUIs as being distinct from the OS; they're now an inextricable part of the OSes that they belong to--and they are by far the largest part, and by far the most expensive and difficult part to create.

There are only two ways to sell a product: price and features. When OSes are free, OS companies cannot compete on price, and so they compete on features. This means that they are always trying to outdo each other writing code that, until recently, was not considered to be part of an OS at all: stuff like GUIs. This explains a lot about how these companies behave.

It explains why Microsoft added a browser to their OS, for example. It is 
easy t o get free browsers, just as to get free OSes. If browsers are 
free, and OSes ar e free, it would seem that there is no way to make money 
from browsers or OSes. But if you can integrate a browser into the OS and 
thereby imbue both of them wi th new features, you have a salable product.

Setting aside, for the moment, the fact that this makes government
anti-trust lawyers really mad, this strategy makes sense. At least, it
makes sense if you assume (as Microsoft's management appears to) that the
OS has to be protected at all costs. The real question is whether every new
technological trend that comes down the pike ought to be used as a crutch
to maintain the OS's dominant position. Confronted with the Web phenomenon,
Microsoft had to develop a really good web browser, and they did. But then
they had a choice: they could have made that browser work on many different
OSes, which would give Microsoft a strong position in the Internet world no
matter what happened to their OS market share. Or they could make the
browser one with the OS, gambling that this would make the OS look so
modern and sexy that it would help to preserve their dominance in that
market. The problem is that when Microsoft's OS position begins to erode
(and since it is currently at something like ninety percent, it can't go
anywhere but down) it will drag everything else down with it.

In your high school geology class you probably were taught that all life 
on eart h exists in a paper-thin shell called the biosphere, which is 
trapped between th ousands of miles of dead rock underfoot, and cold dead 
radioactive empty space a bove. Companies that sell OSes exist in a sort 
of technosphere. Underneath is te chnology that has already become free. 
Above is technology that has yet to be de veloped, or that is too crazy 
and speculative to be productized just yet. Like t he Earth's biosphere, 
the technosphere is very thin compared to what is above an d what is 
below.

But it moves a lot faster. In various parts of our world, it is possible 
to go a nd visit rich fossil beds where skeleton lies piled upon skeleton, 
recent ones o n top and more ancient ones below. In theory they go all the 
way back to the fir st single-celled organisms. And if you use your 
imagination a bit, you can under stand that, if you hang around long 
enough, you'll become fossilized there too, and in time some more advanced 
organism will become fossilized on top of you.

The fossil record--the La Brea Tar Pit--of software technology is the 
Internet. Anything that shows up there is free for the taking (possibly 
illegal, but free) . Executives at companies like Microsoft must get used 
to the experience--unthin kable in other industries--of throwing millions 
of dollars into the development of new technologies, such as Web browsers, 
and then seeing the same or equivalen t software show up on the Internet 
two years, or a year, or even just a few mont hs, later.

By continuing to develop new technologies and add features onto their 
products t hey can keep one step ahead of the fossilization process, but 
on certain days th ey must feel like mammoths caught at La Brea, using all 
their energies to pull t heir feet, over and over again, out of the 
sucking hot tar that wants to cover a nd envelop them.

Survival in this biosphere demands sharp tusks and heavy, stomping feet at 
one e nd of the organization, and Microsoft famously has those. But 
trampling the othe r mammoths into the tar can only keep you alive for so 
long. The danger is that in their obsession with staying out of the fossil 
beds, these companies will for get about what lies above the biosphere: 
the realm of new technology. In other w ords, they must hang onto their 
primitive weapons and crude competitive instinct s, but also evolve 
powerful brains. This appears to be what Microsoft is doing w ith its 
research division, which has been hiring smart people right and left (He 
re I should mention that although I know, and socialize with, several 
people in that company's research division, we never talk about business 
issues and I have
 little to no idea what the hell they are up to. I have learned much more 
about Microsoft by using the Linux operating system than I ever would have 
done by usi ng Windows).

Never mind how Microsoft used to make money; today, it is making its money
on a kind of temporal arbitrage. "Arbitrage," in the usual sense, means to
make money by taking advantage of differences in the price of something
between different markets. It is spatial, in other words, and hinges on the
arbitrageur knowing what is going on simultaneously in different places.
Microsoft is making money by taking advantage of differences in the price
of technology in different times. Temporal arbitrage, if I may coin a
phrase, hinges on the arbitrageur knowing what technologies people will pay
money for next year, and how soon afterwards those same technologies will
become free. What spatial and temporal arbitrage have in common is that
both hinge on the arbitrageur's being extremely well-informed; one about
price gradients across space at a given time, and the other about price
gradients over time in a given place.
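
To put some toy numbers on it (these are invented out of thin air, purely
to show the shape of the thing, not anybody's actual revenue):

    # Toy numbers, invented purely to show the shape of temporal arbitrage:
    # a technology people will pay for now, and the moment it becomes free.
    price_per_copy = 90.0         # what buyers pay while the thing is scarce
    copies_per_year = 1_000_000   # demand while it is the hot new feature
    years_until_free = 2          # after this it lands in the fossil beds

    # Revenue exists only inside the window between "people will pay for
    # this" and "anyone can download the same thing for nothing."
    window_revenue = price_per_copy * copies_per_year * years_until_free
    print(f"Revenue before fossilization: ${window_revenue:,.0f}")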

So Apple/Microsoft shower new features upon their users almost daily, in 
the hop es that a steady stream of genuine technical innovations, combined 
with the "I w ant to believe" phenomenon, will prevent their customers 
from looking across the
 road towards the cheaper and better OSes that are available to them. The 
questi on is whether this makes sense in the long run. If Microsoft is 
addicted to OSes
 as Apple is to hardware, then they will bet the whole farm on their OSes, 
and t ie all of their new applications and technologies to them. Their 
continued survi val will then depend on these two things: adding more 
features to their OSes so that customers will not switch to the cheaper 
alternatives, and maintaining the image that, in some mysterious way, 
gives those customers the feeling that they are getting something for 
their money.

The latter is a truly strange and interesting cultural phenomenon.

THE INTERFACE CULTURE


A few years ago I walked into a grocery store somewhere and was presented 
with t he following tableau vivant: near the entrance a young couple were 
standing in f ront of a large cosmetics display. The man was stolidly 
holding a shopping baske t between his hands while his mate raked 
blister-packs of makeup off the display
 and piled them in. Since then I've always thought of that man as the 
personific ation of an interesting human tendency: not only are we not 
offended to be dazzl ed by manufactured images, but we like it. We 
practically insist on it. We are e ager to be complicit in our own 
dazzlement: to pay money for a theme park ride, vote for a guy who's 
obviously lying to us, or stand there holding the basket as
 it's filled up with cosmetics.

I was in Disney World recently, specifically the part of it called the 
Magic Kin gdom, walking up Main Street USA. This is a perfect gingerbready 
Victorian small
 town that culminates in a Disney castle. It was very crowded; we shuffled 
rathe r than walked. Directly in front of me was a man with a camcorder. 
It was one of
 the new breed of camcorders where instead of peering through a viewfinder 
you g aze at a flat-panel color screen about the size of a playing card, 
which televis es live coverage of whatever the camcorder is seeing. He was 
holding the applian ce close to his face, so that it obstructed his view. 
Rather than go see a real small town for free, he had paid money to see a 
pretend one, and rather than see
 it with the naked eye he was watching it on television.

And rather than stay home and read a book, I was watching him.

Americans' preference for mediated experiences is obvious enough, and I'm 
not go ing to keep pounding it into the ground. I'm not even going to make 
snotty comme nts about it--after all, I was at Disney World as a paying 
customer. But it clea rly relates to the colossal success of GUIs and so I 
have to talk about it some.
 Disney does mediated experiences better than anyone. If they understood 
what OS es are, and why people use them, they could crush Microsoft in a 
year or two.

In the part of Disney World called the Animal Kingdom there is a new 
attraction,
 slated to open in March 1999, called the Maharajah Jungle Trek. It was 
 open for sneak previews when I was there. This is a complete 
 stone-by-stone reproduction of a hypothetical ruin in the jungles of 
 India. According to its backstory, it
was built by a local rajah in the 16th Century as a game reserve. He would 
go th ere with his princely guests to hunt Bengal tigers. As time went on 
it fell into
 disrepair and the tigers and monkeys took it over; eventually, around the 
time of India's independence, it became a government wildlife reserve, now 
open to vi sitors.

The place looks more like what I have just described than any actual
building you might find in India. All the stones in the broken walls are
weathered as if monsoon rains had been trickling down them for centuries,
the paint on the gorgeous murals is flaked and faded just so, and Bengal
tigers loll amid stumps of broken columns. Where modern repairs have been
made to the ancient structure, they've been done, not as Disney's engineers
would do them, but as thrifty Indian janitors would--with hunks of bamboo
and rust-spotted hunks of rebar. The rust is painted on, of course, and
protected from real rust by a plastic clear-coat, but you can't tell unless
you get down on your knees.

In one place you walk along a stone wall with a series of old pitted 
friezes car ved into it. One end of the wall has broken off and settled 
into the earth, perh aps because of some long-forgotten earthquake, and so 
a broad jagged crack runs across a panel or two, but the story is still 
readable: first, primordial chaos leads to a flourishing of many animal 
species. Next, we see the Tree of Life sur rounded by diverse animals. 
This is an obvious allusion (or, in showbiz lingo, a
 tie-in) to the gigantic Tree of Life that dominates the center of 
Disney's Anim al Kingdom just as the Castle dominates the Magic Kingdom or 
the Sphere does Epc ot. But it's rendered in historically correct style 
and could probably fool anyo ne who didn't have a Ph.D. in Indian art 
history.

The next panel shows a mustachioed H. sapiens chopping down the Tree of 
Life wit h a scimitar, and the animals fleeing every which way. The one 
after that shows the misguided human getting walloped by a tidal wave, 
part of a latter-day Delug e presumably brought on by his stupidity.

The final panel, then, portrays the Sapling of Life beginning to grow 
back, but now Man has ditched the edged weapon and joined the other 
animals in standing ar ound to adore and praise it.

It is, in other words, a prophecy of the Bottleneck: the scenario, 
commonly espo used among modern-day environmentalists, that the world 
faces an upcoming period
 of grave ecological tribulations that will last for a few decades or 
centuries and end when we find a new harmonious modus vivendi with Nature.

Taken as a whole the frieze is a pretty brilliant piece of work. Obviously 
it's not an ancient Indian ruin, and some person or people now living 
deserve credit for it. But there are no signatures on the Maharajah's game 
reserve at Disney Wo rld. There are no signatures on anything, because it 
would ruin the whole effect
 to have long strings of production credits dangling from every 
custom-worn bric k, as they do from Hollywood movies.

Among Hollywood writers, Disney has the reputation of being a real wicked 
stepmo ther. It's not hard to see why. Disney is in the business of 
putting out a produ ct of seamless illusion--a magic mirror that reflects 
the world back better than
 it really is. But a writer is literally talking to his or her readers, 
 not just creating an ambience or presenting them with something to look 
 at; and just as
the command-line interface opens a much more direct and explicit channel 
from us er to machine than the GUI, so it is with words, writer, and 
reader.

The word, in the end, is the only system of encoding thoughts--the only 
medium-- that is not fungible, that refuses to dissolve in the devouring 
torrent of elect ronic media (the richer tourists at Disney World wear 
t-shirts printed with the names of famous designers, because designs 
themselves can be bootlegged easily a nd with impunity. The only way to 
make clothing that cannot be legally bootlegge d is to print copyrighted 
and trademarked words on it; once you have taken that step, the clothing 
itself doesn't really matter, and so a t-shirt is as good as anything 
else. T-shirts with expensive words on them are now the insignia of the
 upper class. T-shirts with cheap words, or no words at all, are for the 
commone rs).

But this special quality of words and of written communication would have 
the sa me effect on Disney's product as spray-painted graffiti on a magic 
mirror. So Di sney does most of its communication without resorting to 
words, and for the most
 part, the words aren't missed. Some of Disney's older properties, such as 
 Peter Pan, Winnie the Pooh, and Alice in Wonderland, came out of books. 
 But the autho
rs' names are rarely if ever mentioned, and you can't buy the original 
books at the Disney store. If you could, they would all seem old and 
queer, like very bad
 knockoffs of the purer, more authentic Disney versions. Compared to more 
 recent productions like Beauty and the Beast and Mulan, the Disney movies 
 based on the
se books (particularly Alice in Wonderland and Peter Pan) seem deeply 
bizarre, a nd not wholly appropriate for children. That stands to reason, 
because Lewis Car roll and J.M. Barrie were very strange men, and such is 
the nature of the writte n word that their personal strangeness shines 
straight through all the layers of
 Disneyfication like x-rays through a wall. Probably for this very reason, 
Disne y seems to have stopped buying books altogether, and now finds its 
themes and ch aracters in folk tales, which have the lapidary, time-worn 
quality of the ancien t bricks in the Maharajah's ruins.

If I can risk a broad generalization, most of the people who go to Disney 
World have zero interest in absorbing new ideas from books. Which sounds 
snide, but li sten: they have no qualms about being presented with ideas 
in other forms. Disne y World is stuffed with environmental messages now, 
and the guides at Animal Kin gdom can talk your ear off about biology.

If you followed those tourists home, you might find art, but it would be 
the sor t of unsigned folk art that's for sale in Disney World's African- 
and Asian-them ed stores. In general they only seem comfortable with media 
that have been ratif ied by great age, massive popular acceptance, or 
both.

In this world, artists are like the anonymous, illiterate stone carvers 
who buil t the great cathedrals of Europe and then faded away into 
unmarked graves in the
 churchyard. The cathedral as a whole is awesome and stirring in spite, 
and poss ibly because, of the fact that we have no idea who built it. When 
we walk throug h it we are communing not with individual stone carvers but 
with an entire cultu re.

Disney World works the same way. If you are an intellectual type, a reader 
or wr iter of books, the nicest thing you can say about this is that the 
execution is superb. But it's easy to find the whole environment a little 
creepy, because som ething is missing: the translation of all its content 
into clear explicit writte n words, the attribution of the ideas to 
specific people. You can't argue with i t. It seems as if a hell of a lot 
might be being glossed over, as if Disney Worl d might be putting one over 
on us, and possibly getting away with all kinds of b uried assumptions and 
muddled thinking.

But this is precisely the same as what is lost in the transition from the 
comman d-line interface to the GUI.

Disney and Apple/Microsoft are in the same business: short-circuiting 
laborious,
 explicit verbal communication with expensively designed interfaces. 
 Disney is a sort of user interface unto itself--and more than just 
 graphical. Let's call it a Sensorial Interface. It can be applied to 
 anything in the world, real or imag
ined, albeit at staggering expense.

Why are we rejecting explicit word-based interfaces, and embracing 
graphical or sensorial ones--a trend that accounts for the success of both 
Microsoft and Disn ey?

Part of it is simply that the world is very complicated now--much more 
complicat ed than the hunter-gatherer world that our brains evolved to 
cope with--and we s imply can't handle all of the details. We have to 
delegate. We have no choice bu t to trust some nameless artist at Disney 
or programmer at Apple or Microsoft to
 make a few choices for us, close off some options, and give us a 
conveniently p ackaged executive summary.

But more importantly, it comes out of the fact that, during this century,
intellectualism failed, and everyone knows it. In places like Russia and
Germany, the common people agreed to loosen their grip on traditional
folkways, mores, and religion, and let the intellectuals run with the ball,
and they screwed everything up and turned the century into an abattoir.
Those wordy intellectuals used to be merely tedious; now they seem kind of
dangerous as well.

We Americans are the only ones who didn't get creamed at some point during 
all o f this. We are free and prosperous because we have inherited 
political and value s systems fabricated by a particular set of 
eighteenth-century intellectuals who
 happened to get it right. But we have lost touch with those 
intellectuals, and with anything like intellectualism, even to the point 
of not reading books any m ore, though we are literate. We seem much more 
comfortable with propagating thos e values to future generations 
nonverbally, through a process of being steeped i n media. Apparently this 
actually works to some degree, for police in many lands
 are now complaining that local arrestees are insisting on having their 
Miranda rights read to them, just like perps in American TV cop shows. 
When it's explain ed to them that they are in a different country, where 
those rights do not exist , they become outraged. Starsky and Hutch 
reruns, dubbed into diverse languages,
 may turn out, in the long run, to be a greater force for human rights 
than the Declaration of Independence.

A huge, rich, nuclear-tipped culture that propagates its core values 
through med ia steepage seems like a bad idea. There is an obvious risk of 
running astray he re. Words are the only immutable medium we have, which 
is why they are the vehic le of choice for extremely important concepts 
like the Ten Commandments, the Kor an, and the Bill of Rights. Unless the 
messages conveyed by our media are someho w pegged to a fixed, written set 
of precepts, they can wander all over the place
 and possibly dump loads of crap into people's minds.

Orlando used to have a military installation called McCoy Air Force Base, 
with l ong runways from which B-52s could take off and reach Cuba, or just 
about anywhe re else, with loads of nukes. But now McCoy has been scrapped 
and repurposed. It
 has been absorbed into Orlando's civilian airport. The long runways are 
being u sed to land 747-loads of tourists from Brazil, Italy, Russia and 
Japan, so that they can come to Disney World and steep in our media for a 
while.

To traditional cultures, especially word-based ones such as Islam, this is 
infin itely more threatening than the B-52s ever were. It is obvious, to 
everyone outs ide of the United States, that our arch-buzzwords, 
multiculturalism and diversit y, are false fronts that are being used (in 
many cases unwittingly) to conceal a
 global trend to eradicate cultural differences. The basic tenet of 
multicultura lism (or "honoring diversity" or whatever you want to call 
it) is that people ne ed to stop judging each other-to stop asserting 
(and, eventually, to stop believ ing) that this is right and that is 
wrong, this true and that false, one thing u gly and another thing 
beautiful, that God exists and has this or that set of qua lities.

The lesson most people are taking home from the Twentieth Century is that, 
in or der for a large number of different cultures to coexist peacefully 
on the globe (or even in a neighborhood) it is necessary for people to 
suspend judgment in th is way. Hence (I would argue) our suspicion of, and 
hostility towards, all autho rity figures in modern culture. As David 
Foster Wallace has explained in his ess ay "E Unibus Pluram," this is the 
fundamental message of television; it is the m essage that people take 
home, anyway, after they have steeped in our media long enough. It's not 
expressed in these highfalutin terms, of course. It comes throu gh as the 
presumption that all authority figures--teachers, generals, cops, mini 
sters, politicians--are hypocritical buffoons, and that hip jaded coolness 
is th e only way to be.

The problem is that once you have done away with the ability to make
judgments as to right and wrong, true and false, etc., there's no real
culture left. All that remains is clog dancing and macrame. The ability to
make judgments, to believe things, is the entire point of having a culture.
I think this is why guys with machine guns sometimes pop up in places like
Luxor, and begin pumping bullets into Westerners. They perfectly understand
the lesson of McCoy Air Force Base. When their sons come home wearing
Chicago Bulls caps with the bills turned sideways, the dads go out of their
minds.

The global anti-culture that has been conveyed into every cranny of the 
world by
 television is a culture unto itself, and by the standards of great and 
ancient cultures like Islam and France, it seems grossly inferior, at 
least at first. Th e only good thing you can say about it is that it makes 
world wars and Holocaust s less likely--and that is actually a pretty good 
thing!

The only real problem is that anyone who has no culture, other than this 
global monoculture, is completely screwed. Anyone who grows up watching 
TV, never sees any religion or philosophy, is raised in an atmosphere of 
moral relativism, lear ns about civics from watching bimbo eruptions on 
network TV news, and attends a university where postmodernists vie to 
outdo each other in demolishing tradition al notions of truth and quality, 
is going to come out into the world as one pret ty feckless human being. 
And--again--perhaps the goal of all this is to make us feckless so we 
won't nuke each other.

On the other hand, if you are raised within some specific culture, you end 
up wi th a basic set of tools that you can use to think about and 
understand the world . You might use those tools to reject the culture you 
were raised in, but at lea st you've got some tools.

In this country, the people who run things--who populate major law firms 
and cor porate boards--understand all of this at some level. They pay lip 
service to mul ticulturalism and diversity and non-judgmentalness, but 
they don't raise their o wn children that way. I have highly educated, 
technically sophisticated friends who have moved to small towns in Iowa to 
live and raise their children, and ther e are Hasidic Jewish enclaves in 
New York where large numbers of kids are being brought up according to 
traditional beliefs. Any suburban community might be tho ught of as a 
place where people who hold certain (mostly implicit) beliefs go to
 live among others who think the same way.

And not only do these people feel some responsibility to their own 
children, but
 to the country as a whole. Some of the upper class are vile and cynical, 
of cou rse, but many spend at least part of their time fretting about what 
direction th e country is going in, and what responsibilities they have. 
And so issues that a re important to book-reading intellectuals, such as 
global environmental collaps e, eventually percolate through the porous 
buffer of mass culture and show up as
 ancient Hindu ruins in Orlando.

You may be asking: what the hell does all this have to do with operating 
systems ? As I've explained, there is no way to explain the domination of 
the OS market by Apple/Microsoft without looking to cultural explanations, 
and so I can't get anywhere, in this essay, without first letting you know 
where I'm coming from vi s-a-vis contemporary culture.

Contemporary culture is a two-tiered system, like the Morlocks and the 
Eloi in H .G. Wells's The Time Machine, except that it's been turned 
upside down. In The T ime Machine the Eloi were an effete upper class, 
supported by lots of subterrane an Morlocks who kept the technological 
wheels turning. But in our world it's the
 other way round. The Morlocks are in the minority, and they are running 
the sho w, because they understand how everything works. The much more 
numerous Eloi lea rn everything they know from being steeped from birth in 
electronic media direct ed and controlled by book-reading Morlocks. So 
many ignorant people could be dan gerous if they got pointed in the wrong 
direction, and so we've evolved a popula r culture that is (a) almost 
unbelievably infectious and (b) neuters every perso n who gets infected by 
it, by rendering them unwilling to make judgments and inc apable of taking 
stands.

Morlocks, who have the energy and intelligence to comprehend details, go
out and master complex subjects and produce Disney-like Sensorial
Interfaces so that Eloi can get the gist without having to strain their
minds or endure boredom. Those Morlocks will go to India and tediously
explore a hundred ruins, then come home and build sanitary bug-free
versions: highlight films, as it were. This costs a lot, because Morlocks
insist on good coffee and first-class airline tickets, but that's no
problem because Eloi like to be dazzled and will gladly pay for it all.

Now I realize that most of this probably sounds snide and bitter to the 
point of
 absurdity: your basic snotty intellectual throwing a tantrum about those 
unlett ered philistines. As if I were a self-styled Moses, coming down 
from the mountai n all alone, carrying the stone tablets bearing the Ten 
Commandments carved in i mmutable stone--the original command-line 
interface--and blowing his stack at th e weak, unenlightened Hebrews 
worshipping images. Not only that, but it sounds l ike I'm pumping some 
sort of conspiracy theory.

But that is not where I'm going with this. The situation I describe, here, 
could
 be bad, but doesn't have to be bad and isn't necessarily bad now:


It simply is the case that we are way too busy, nowadays, to comprehend 
everythi ng in detail. And it's better to comprehend it dimly, through an 
interface, than
 not at all. Better for ten million Eloi to go on the Kilimanjaro Safari 
at Disn ey World than for a thousand cardiovascular surgeons and mutual 
fund managers to
 go on "real" ones in Kenya. The boundary between these two classes is 
more poro us than I've made it sound. I'm always running into regular 
dudes--construction workers, auto mechanics, taxi drivers, galoots in 
general--who were largely alit erate until something made it necessary for 
them to become readers and start act ually thinking about things. Perhaps 
they had to come to grips with alcoholism, perhaps they got sent to jail, 
or came down with a disease, or suffered a crisis
 in religious faith, or simply got bored. Such people can get up to speed 
on par ticular subjects quite rapidly. Sometimes their lack of a broad 
education makes them over-apt to go off on intellectual wild goose chases, 
but, hey, at least a wild goose chase gives you some exercise. The spectre 
of a polity controlled by the fads and whims of voters who actually 
believe that there are significant dif ferences between Bud Lite and 
Miller Lite, and who think that professional wrest ling is for real, is 
naturally alarming to people who don't. But then countries controlled via 
the command-line interface, as it were, by double-domed intellect uals, be 
they religious or secular, are generally miserable places to live. Soph 
isticated people deride Disneyesque entertainments as pat and saccharine, 
but, h ey, if the result of that is to instill basically warm and 
sympathetic reflexes,
 at a preverbal level, into hundreds of millions of unlettered 
media-steepers, t hen how bad can it be? We killed a lobster in our 
kitchen last night and my daug hter cried for an hour. The Japanese, who 
used to be just about the fiercest peo ple on earth, have become 
infatuated with cuddly adorable cartoon characters. My
 own family--the people I know best--is divided about evenly between 
people who will probably read this essay and people who almost certainly 
won't, and I can't
 say for sure that one group is necessarily warmer, happier, or 
better-adjusted than the other.

MORLOCKS AND ELOI AT THE KEYBOARD


Back in the days of the command-line interface, users were all Morlocks who
had to convert their thoughts into alphanumeric symbols and type them in, a
grindingly tedious process that stripped away all ambiguity, laid bare all
hidden assumptions, and cruelly punished laziness and imprecision. Then the
interface-makers went to work on their GUIs, and introduced a new semiotic
layer between people and machines. People who use such systems have
abdicated the responsibility, and surrendered the power, of sending bits
directly to the chip that's doing the arithmetic, and handed that
responsibility and power over to the OS. This is tempting because giving
clear instructions, to anyone or anything, is difficult. We cannot do it
without thinking, and depending on the complexity of the situation, we may
have to think hard about abstract things, and consider any number of
ramifications, in order to do a good job of it. For most of us, this is
hard work. We want things to be easier. How badly we want it can be
measured by the size of Bill Gates's fortune.

The OS has (therefore) become a sort of intellectual labor-saving device 
that tr ies to translate humans' vaguely expressed intentions into bits. 
In effect we ar e asking our computers to shoulder responsibilities that 
have always been consid ered the province of human beings--we want them to 
understand our desires, to an ticipate our needs, to foresee consequences, 
to make connections, to handle rout ine chores without being asked, to 
remind us of what we ought to be reminded of while filtering out noise.

At the upper (which is to say, closer to the user) levels, this is done 
through a set of conventions--menus, buttons, and so on. These work in the 
sense that an alogies work: they help Eloi understand abstract or 
unfamiliar concepts by liken ing them to something known. But the loftier 
word "metaphor" is used.

The overarching concept of the MacOS was the "desktop metaphor" and it
subsumed any number of lesser (and frequently conflicting, or at least
mixed) metaphors. Under a GUI, a file (frequently called "document") is
metaphrased as a window on the screen (which is called a "desktop"). The
window is almost always too small to contain the document and so you "move
around," or, more pretentiously, "navigate" in the document by "clicking
and dragging" the "thumb" on the "scroll bar." When you "type" (using a
keyboard) or "draw" (using a "mouse") into the "window" or use pull-down
"menus" and "dialog boxes" to manipulate its contents, the results of your
labors get stored (at least in theory) in a "file," and later you can pull
the same information back up into another "window." When you don't want it
anymore, you "drag" it into the "trash."

There is massively promiscuous metaphor-mixing going on here, and I could
deconstruct it 'til the cows come home, but I won't. Consider only one
word: "document." When we document something in the real world, we make
fixed, permanent, immutable records of it. But computer documents are
volatile, ephemeral constellations of data. Sometimes (as when you've just
opened or saved them) the document as portrayed in the window is identical
to what is stored, under the same name, in a file on the disk, but other
times (as when you have made changes without saving them) it is completely
different. In any case, every time you hit "Save" you annihilate the
previous version of the "document" and replace it with whatever happens to
be in the window at the moment. So even the word "save" is being used in a
sense that is grotesquely misleading--"destroy one version, save another"
would be more accurate.

Anyone who uses a word processor for very long inevitably has the 
experience of putting hours of work into a long document and then losing 
it because the comput er crashes or the power goes out. Until the moment 
that it disappears from the s creen, the document seems every bit as solid 
and real as if it had been typed ou t in ink on paper. But in the next 
moment, without warning, it is completely and
 irretrievably gone, as if it had never existed. The user is left with a 
 feeling of disorientation (to say nothing of annoyance) stemming from a 
 kind of metapho
r shear--you realize that you've been living and thinking inside of a 
metaphor t hat is essentially bogus.

So GUIs use metaphors to make computing easier, but they are bad 
metaphors. Lear ning to use them is essentially a word game, a process of 
learning new definitio ns of words like "window" and "document" and "save" 
that are different from, and
 in many cases almost diametrically opposed to, the old. Somewhat 
improbably, th is has worked very well, at least from a commercial 
standpoint, which is to say that Apple/Microsoft have made a lot of money 
off of it. All of the other modern
 operating systems have learned that in order to be accepted by users they 
must conceal their underlying gutwork beneath the same sort of spackle. 
This has some
 advantages: if you know how to use one GUI operating system, you can 
probably w ork out how to use any other in a few minutes. Everything works 
a little differe ntly, like European plumbing--but with some fiddling 
around, you can type a memo
 or surf the web.

Most people who shop for OSes (if they bother to shop at all) are 
comparing not the underlying functions but the superficial look and feel. 
The average buyer of
 an OS is not really paying for, and is not especially interested in, the 
low-le vel code that allocates memory or writes bytes onto the disk. What 
we're really buying is a system of metaphors. And--much more 
important--what we're buying int o is the underlying assumption that 
metaphors are a good way to deal with the wo rld.

Recently a lot of new hardware has become available that gives computers 
numerou s interesting ways of affecting the real world: making paper spew 
out of printer s, causing words to appear on screens thousands of miles 
away, shooting beams of
 radiation through cancer patients, creating realistic moving pictures of 
the Ti tanic. Windows is now used as an OS for cash registers and bank 
tellers' termina ls. My satellite TV system uses a sort of GUI to change 
channels and show progra m guides. Modern cellular telephones have a crude 
GUI built into a tiny LCD scre en. Even Legos now have a GUI: you can buy 
a Lego set called Mindstorms that ena bles you to build little Lego robots 
and program them through a GUI on your comp uter.

So we are now asking the GUI to do a lot more than serve as a glorified
typewriter. Now we want it to become a generalized tool for dealing with
reality. This has become a bonanza for companies that make a living out of
bringing new technology to the mass market.

Obviously you cannot sell a complicated technological system to people 
without s ome sort of interface that enables them to use it. The internal 
combustion engin e was a technological marvel in its day, but useless as a 
consumer good until a clutch, transmission, steering wheel and throttle 
were connected to it. That odd
 collection of gizmos, which survives to this day in every car on the 
 road, made up what we would today call a user interface. But if cars had 
 been invented aft
er Macintoshes, carmakers would not have bothered to gin up all of these 
arcane devices. We would have a computer screen instead of a dashboard, 
and a mouse (or
 at best a joystick) instead of a steering wheel, and we'd shift gears by 
pullin g down a menu:

PARK --- REVERSE --- NEUTRAL ---- 3 2 1 --- Help...

A few lines of computer code can thus be made to substitute for any
imaginable mechanical interface. The problem is that in many cases the
substitute is a poor one. Driving a car through a GUI would be a miserable
experience. Even if the GUI were perfectly bug-free, it would be incredibly
dangerous, because menus and buttons simply can't be as responsive as
direct mechanical controls. My friend's dad, the gentleman who was
restoring the MGB, never would have bothered with it if it had been
equipped with a GUI. It wouldn't have been any fun.
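
And the "few lines of code" part is no exaggeration. Here is roughly what
that pull-down transmission would amount to--a toy sketch of my own,
emphatically not anyone's real drive-by-wire software:

    # A toy "transmission menu": the whole mechanical interface of a
    # gearshift lever reduced to a handful of lines. Not any real car's
    # code, just an illustration of how cheap the substitute is to write.
    GEARS = ["PARK", "REVERSE", "NEUTRAL", "3", "2", "1"]

    def shift(requested):
        if requested not in GEARS:
            raise ValueError("No such gear: " + requested)
        print("Engaging " + requested + "...")
        return requested

    current = shift("REVERSE")   # pulling down the menu and clicking
    # Cheap to write, but nothing here is as responsive as a lever in your
    # hand--which is the whole problem.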

The steering wheel and gearshift lever were invented during an era when 
the most
 complicated technology in most homes was a butter churn. Those early 
carmakers were simply lucky, in that they could dream up whatever 
interface was best suite d to the task of driving an automobile, and 
people would learn it. Likewise with
 the dial telephone and the AM radio. By the time of the Second World War, 
most people knew several interfaces: they could not only churn butter but 
also drive a car, dial a telephone, turn on a radio, summon flame from a 
cigarette lighter,
 and change a light bulb.

But now every little thing--wristwatches, VCRs, stoves--is jammed with 
features,
 and every feature is useless without an interface. If you are like me, 
 and like most other consumers, you have never used ninety percent of the 
 available featu
res on your microwave oven, VCR, or cellphone. You don't even know that 
these fe atures exist. The small benefit they might bring you is 
outweighed by the sheer hassle of having to learn about them. This has got 
to be a big problem for maker s of consumer goods, because they can't 
compete without offering features.

It's no longer acceptable for engineers to invent a wholly novel user 
interface for every new product, as they did in the case of the 
automobile, partly because
 it's too expensive and partly because ordinary people can only learn so 
much. I f the VCR had been invented a hundred years ago, it would have 
come with a thumb wheel to adjust the tracking and a gearshift to change 
between forward and rever se and a big cast-iron handle to load or to 
eject the cassettes. It would have h ad a big analog clock on the front of 
it, and you would have set the time by mov ing the hands around on the 
dial. But because the VCR was invented when it was-- during a sort of 
awkward transitional period between the era of mechanical inter faces and 
GUIs--it just had a bunch of pushbuttons on the front, and in order to
 set the time you had to push the buttons in just the right way. This must 
have seemed reasonable enough to the engineers responsible for it, but to 
many users it was simply impossible. Thus the famous blinking 12:00 that 
appears on so many
 VCRs. Computer people call this "the blinking twelve problem". When they 
talk a bout it, though, they usually aren't talking about VCRs.

Modern VCRs usually have some kind of on-screen programming, which means 
that yo u can set the time and control other features through a sort of 
primitive GUI. G UIs have virtual pushbuttons too, of course, but they 
also have other types of v irtual controls, like radio buttons, 
checkboxes, text entry boxes, dials, and sc rollbars. Interfaces made out 
of these components seem to be a lot easier, for m any people, than 
pushing those little buttons on the front of the machine, and s o the 
blinking 12:00 itself is slowly disappearing from America's living rooms. 
The blinking twelve problem has moved on to plague other technologies.

So the GUI has gone beyond being an interface to personal computers, and 
become a sort of meta-interface that is pressed into service for every new 
piece of con sumer technology. It is rarely an ideal fit, but having an 
ideal, or even a good
 interface is no longer the priority; the important thing now is having 
some kin d of interface that customers will actually use, so that 
manufacturers can claim , with a straight face, that they are offering new 
features.

We want GUIs largely because they are convenient and because they are
easy--or at least the GUI makes it seem that way. Of course, nothing is
really easy and simple, and putting a nice interface on top of it does not
change that fact. A car controlled through a GUI would be easier to drive
than one controlled through pedals and steering wheel, but it would be
incredibly dangerous.

By using GUIs all the time we have insensibly bought into a premise that 
few peo ple would have accepted if it were presented to them bluntly: 
namely, that hard things can be made easy, and complicated things simple, 
by putting the right int erface on them. In order to understand how 
bizarre this is, imagine that book re views were written according to the 
same values system that we apply to user int erfaces: "The writing in this 
book is marvelously simple-minded and glib; the au thor glosses over 
complicated subjects and employs facile generalizations in alm ost every 
sentence. Readers rarely have to think, and are spared all of the diff 
iculty and tedium typically involved in reading old-fashioned books." As 
long as
 we stick to simple operations like setting the clocks on our VCRs, this 
is not so bad. But as we try to do more ambitious things with our 
technologies, we inev itably run into the problem of:


METAPHOR SHEAR


I began using Microsoft Word as soon as the first version was released 
around 19 85. After some initial hassles I found it to be a better tool 
than MacWrite, whi ch was its only competition at the time. I wrote a lot 
of stuff in early version s of Word, storing it all on floppies, and 
transferred the contents of all my fl oppies to my first hard drive, which 
I acquired around 1987. As new versions of Word came out I faithfully 
upgraded, reasoning that as a writer it made sense fo r me to spend a 
certain amount of money on tools.

Sometime in the mid-1990's I attempted to open one of my old, circa-1985
Word documents using the version of Word then current: 6.0. It didn't work.
Word 6.0 did not recognize a document created by an earlier version of
itself. By opening it as a text file, I was able to recover the sequences
of letters that made up the text of the document. My words were still
there. But the formatting had been run through a log chipper--the words I'd
written were interrupted by spates of empty rectangular boxes and
gibberish.

Now, in the context of a business (the chief market for Word) this sort of
thing is only an annoyance--one of the routine hassles that go along with
using computers. It's easy to buy little file converter programs that will
take care of this problem. But if you are a writer whose career is words,
whose professional identity is a corpus of written documents, this kind of
thing is extremely disquieting. There are very few fixed assumptions in my
line of work, but one of them is that once you have written a word, it is
written, and cannot be unwritten. The ink stains the paper, the chisel cuts
the stone, the stylus marks the clay, and something has irrevocably
happened (my brother-in-law is a theologian who reads 3250-year-old
cuneiform tablets--he can recognize the handwriting of particular scribes,
and identify them by name). But word-processing software--particularly the
sort that employs special, complex file formats--has the eldritch power to
unwrite things. A small change in file formats, or a few twiddled bits, and
months' or years' literary output can cease to exist.
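
To see how few twiddled bits it takes, here is a minimal sketch (mine; the
"format" is a toy, nothing like Word's): flip one bit in plain text and you
get a typo; flip the same bit in a length-prefixed binary record and the
reader is lost.

    # Flip one bit in plain text and you get a typo; flip one bit in a toy
    # length-prefixed binary record and the whole thing becomes unreadable.
    # The "format" here is invented for illustration--nothing like Word's.
    import struct

    text = b"It was a dark and stormy night."
    record = struct.pack(">I", len(text)) + text    # header: byte count

    def flip_a_bit(data, index):
        damaged = bytearray(data)
        damaged[index] ^= 0x01
        return bytes(damaged)

    print(flip_a_bit(text, 3))                      # still legible prose
    broken = flip_a_bit(record, 0)                  # one bit in the header
    claimed = struct.unpack(">I", broken[:4])[0]
    print("Header now claims %d bytes; the reader is lost." % claimed)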

Now this was technically a fault in the application (Word 6.0 for the 
Macintosh)
 not the operating system (MacOS 7 point something) and so the initial 
 target of my annoyance was the people who were responsible for Word. But. 
 On the other ha
nd, I could have chosen the "save as text" option in Word and saved all of 
my do cuments as simple telegrams, and this problem would not have arisen. 
Instead I h ad allowed myself to be seduced by all of those flashy 
formatting options that h adn't even existed until GUIs had come along to 
make them practicable. I had got ten into the habit of using them to make 
my documents look pretty (perhaps prett ier than they deserved to look; 
all of the old documents on those floppies turne d out to be more or less 
crap). Now I was paying the price for that self-indulge nce. Technology 
had moved on and found ways to make my documents look even prett ier, and 
the consequence of it was that all old ugly documents had ceased to exi 
st.

It was--if you'll pardon me for a moment's strange little fantasy--as if 
I'd gon e to stay at some resort, some exquisitely designed and 
art-directed hotel, plac ing myself in the hands of past masters of the 
Sensorial Interface, and had sat down in my room and written a story in 
ballpoint pen on a yellow legal pad, and when I returned from dinner, 
discovered that the maid had taken my work away and
 left behind in its place a quill pen and a stack of fine 
parchment--explaining that the room looked ever so much finer this way, 
and it was all part of a routi ne upgrade. But written on these sheets of 
paper, in flawless penmanship, were l ong sequences of words chosen at 
random from the dictionary. Appalling, sure, bu t I couldn't really lodge 
a complaint with the management, because by staying at
 this resort I had given my consent to it. I had surrendered my Morlock 
credenti als and become an Eloi.


LINUX

During the late 1980's and early 1990's I spent a lot of time programming
Macintoshes, and eventually decided to fork over several hundred dollars
for an Apple product called the Macintosh Programmer's Workshop, or MPW.
MPW had competitors, but it was unquestionably the premier software
development system for the Mac. It was what Apple's own engineers used to
write Macintosh code. Given that MacOS was far more technologically
advanced, at the time, than its competition, and that Linux did not even
exist yet, and given that this was the actual program used by Apple's
world-class team of creative engineers, I had high expectations. It arrived
on a stack of floppy disks about a foot high, and so there was plenty of
time for my excitement to build during the endless installation process.
The first time I launched MPW, I was probably expecting some kind of
touchy-feely multimedia showcase. Instead it was austere, almost to the
point of being intimidating. It was a scrolling window into which you could
type simple, unformatted text. The system would then interpret these lines
of text as commands, and try to execute them.

It was, in other words, a glass teletype running a command line interface.
It came with all sorts of cryptic but powerful commands, which could be
invoked by typing their names, and which I learned to use only gradually.
It was not until a few years later, when I began messing around with Unix,
that I understood that the command line interface embodied in MPW was a
re-creation of Unix.
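
The whole idea of a glass teletype is small enough to sketch in a dozen
lines. This is a toy interpreter of my own--not MPW, not a real shell--but
the loop is the same: read a line, treat it as a command, run it, print the
answer, repeat.

    # A toy glass teletype: read a line, treat it as a command, execute it,
    # print the result, and wait for the next line. Not MPW, not a shell.
    import datetime

    COMMANDS = {
        "date": lambda *args: datetime.datetime.now().isoformat(),
        "echo": lambda *args: " ".join(args),
    }

    while True:
        line = input("> ").strip()
        if line in ("", "exit", "quit"):
            break
        name, *args = line.split()
        handler = COMMANDS.get(name)
        print(handler(*args) if handler else name + ": command not found")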

In other words, the first thing that Apple's hackers had done when they'd 
got th e MacOS up and running--probably even before they'd gotten it up 
and running--wa s to re-create the Unix interface, so that they would be 
able to get some useful
 work done. At the time, I simply couldn't get my mind around this, but: 
as far as Apple's hackers were concerned, the Mac's vaunted Graphical User 
Interface wa s an impediment, something to be circumvented before the 
little toaster even cam e out onto the market.

Even before my Powerbook crashed and obliterated my big file in July 1995, 
there
 had been danger signs. An old college buddy of mine, who starts and runs 
high-t ech companies in Boston, had developed a commercial product using 
Macintoshes as
 the front end. Basically the Macs were high-performance graphics 
terminals, cho sen for their sweet user interface, giving users access to 
a large database of g raphical information stored on a network of much 
more powerful, but less user-fr iendly, computers. This fellow was the 
second person who turned me on to Macinto shes, by the way, and through 
the mid-1980's we had shared the thrill of being h igh-tech cognoscenti, 
using superior Apple technology in a world of DOS-using kn uckleheads. 
Early versions of my friend's system had worked well, he told me, bu t 
when several machines joined the network, mysterious crashes began to 
occur; s ometimes the whole network would just freeze. It was one of those 
bugs that coul d not be reproduced easily. Finally they figured out that 
these network crashes were triggered whenever a user, scanning the menus 
for a particular item, held d own the mouse button for more than a couple 
of seconds.

Fundamentally, the MacOS could only do one thing at a time. Drawing a menu
on the screen is one thing. So when a menu was pulled down, the Macintosh
was not capable of doing anything else until that indecisive user released
the button.

This is not such a bad thing in a single-user, single-process machine
(although it's a fairly bad thing), but it's no good in a machine that is
on a network, because being on a network implies some kind of continual
low-level interaction with other machines. By failing to respond to the
network, the Mac caused a network-wide crash.

In order to work with other computers, and with networks, and with various
different types of hardware, an OS must be incomparably more complicated
and powerful than either MS-DOS or the original MacOS. The only way of
connecting to the Internet that's worth taking seriously is PPP, the
Point-to-Point Protocol, which (never mind the details) makes your
computer--temporarily--a full-fledged member of the Global Internet, with
its own unique address, and various privileges, powers, and
responsibilities appertaining thereunto. Technically it means your machine
is running the TCP/IP protocol, which, to make a long story short, revolves
around sending packets of data back and forth, in no particular order, and
at unpredictable times, according to a clever and elegant set of rules. But
sending a packet of data is one thing, and so an OS that can only do one
thing at a time cannot simultaneously be part of the Internet and do
anything else. When TCP/IP was invented, running it was an honor reserved
for Serious Computers--mainframes and high-powered minicomputers used in
technical and commercial settings--and so the protocol is engineered around
the assumption that every computer using it is a serious machine, capable
of doing many things at once. Not to put too fine a point on it, a Unix
machine. Neither MacOS nor MS-DOS was originally built with that in mind,
and so when the Internet got hot, radical changes had to be made.
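
One everyday way a Unix program manages "many things at once," at least at
the level of a single process, is to multiplex: watch several sources of
input and service whichever one speaks up first. A minimal sketch of my own
(ports and timings are arbitrary) looks like this:

    # Multiplexing in the Unix manner: watch several sockets and the clock,
    # and service whichever speaks up first. Ports and timings arbitrary.
    import select
    import socket
    import time

    socks = []
    for port in (9001, 9002):
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind(("127.0.0.1", port))
        socks.append(s)

    deadline = time.time() + 5          # also leave room for other work
    while time.time() < deadline:
        readable, _, _ = select.select(socks, [], [], 0.5)
        for s in readable:
            data, peer = s.recvfrom(1024)   # a packet, whenever it arrives
            s.sendto(data, peer)            # answer promptly and move on
        # ...and here the program is free to draw menus, or whatever else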

When my Powerbook broke my heart, and when Word stopped recognizing my old 
files , I jumped to Unix. The obvious alternative to MacOS would have been 
Windows. I didn't really have anything against Microsoft, or Windows. But 
it was pretty obv ious, now, that old PC operating systems were 
overreaching, and showing the stra in, and, perhaps, were best avoided 
until they had learned to walk and chew gum at the same time.

The changeover took place on a particular day in the summer of 1995. I had
been in San Francisco for a couple of weeks, using my PowerBook to work on
a document. The document was too big to fit onto a single floppy, and so I
hadn't made a backup since leaving home. The PowerBook crashed and wiped
out the entire file.

It happened just as I was on my way out the door to visit a company called 
Elect ric Communities, which in those days was in Los Altos. I took my 
PowerBook with me. My friends at Electric Communities were Mac users who 
had all sorts of utili ty software for unerasing files and recovering from 
disk crashes, and I was cert ain I could get most of the file back.

As it turned out, two different Mac crash recovery utilities were unable 
to find
 any trace that my file had ever existed. It was completely and 
systematically w iped out. We went through that hard disk block by block 
and found disjointed fra gments of countless old, discarded, forgotten 
files, but none of what I wanted. The metaphor shear was especially brutal 
that day. It was sort of like watching the girl you've been in love with 
for ten years get killed in a car wreck, and t hen attending her autopsy, 
and learning that underneath the clothes and makeup s he was just flesh 
and blood.

I must have been reeling around the offices of Electric Communities in 
some kind
 of primal Jungian fugue, because at this moment three weirdly 
synchronistic thi ngs happened.

(1) Randy Farmer, a co-founder of the company, came in for a quick visit 
along w ith his family--he was recovering from back surgery at the time. 
He had some hot
 gossip: "Windows 95 mastered today." What this meant was that Microsoft's 
new o perating system had, on this day, been placed on a special compact 
disk known as
 a golden master, which would be used to stamp out a jintillion copies in 
prepar ation for its thunderous release a few weeks later. This news was 
received peevi shly by the staff of Electric Communities, including one 
whose office door was p lastered with the usual assortment of cartoons and 
novelties, e.g.

(2) a copy of a Dilbert cartoon in which Dilbert, the long-suffering 
corporate s oftware engineer, encounters a portly, bearded, hairy man of a 
certain age--a bi t like Santa Claus, but darker, with a certain edge 
about him. Dilbert recognize s this man, based upon his appearance and 
affect, as a Unix hacker, and reacts w ith a certain mixture of 
nervousness, awe, and hostility. Dilbert jabs weakly at
 the disturbing interloper for a couple of frames; the Unix hacker listens 
with a kind of infuriating, beatific calm, then, in the last frame, 
reaches into his pocket. "Here's a nickel, kid," he says, "go buy yourself 
a real computer."

(3) the owner of the door, and the cartoon, was one Doug Barnes. Barnes 
was know n to harbor certain heretical opinions on the subject of 
operating systems. Unli ke most Bay Area techies who revered the 
Macintosh, considering it to be a true hacker's machine, Barnes was fond 
of pointing out that the Mac, with its hermeti cally sealed architecture, 
was actually hostile to hackers, who are prone to tin kering and dogmatic 
about openness. By contrast, the IBM-compatible line of mach ines, which 
can easily be taken apart and plugged back together, was much more h 
ackable.

So when I got home I began messing around with Linux, which is one of many,
many different concrete implementations of the abstract, Platonic ideal
called Unix. I was not looking forward to changing over to a new OS,
because my credit cards were still smoking from all the money I'd spent on
Mac hardware over the years. But Linux's great virtue was, and is, that it
would run on exactly the same sort of hardware as the Microsoft OSes--which
is to say, the cheapest hardware in existence. As if to demonstrate why
this was a great idea, I was, within a week or two of returning home, able
to get my hands on a then-decent computer (a 33-MHz 486 box) for free,
because I knew a guy who worked in an office where they were simply being
thrown away. Once I got it home, I yanked the hood off, stuck my hands in,
and began switching cards around. If something didn't work, I went to a
used-computer outlet and pawed through a bin full of components and bought
a new card for a few bucks.

The availability of all this cheap but effective hardware was an unintended consequence of decisions that had been made more than a decade earlier by IBM and Microsoft. When Windows came out, and brought the GUI to a much larger market, the hardware regime changed: the cost of color video cards and high-resolution monitors began to drop, and is dropping still. This free-for-all approach to hardware meant that Windows was unavoidably clunky compared to MacOS. But the GUI brought computing to such a vast audience that volume went way up and prices collapsed. Meanwhile Apple, which so badly wanted a clean, integrated OS with video neatly integrated into processing hardware, had fallen far behind in market share, at least partly because their beautiful hardware cost so much.

But the price that we Mac owners had to pay for superior aesthetics and engineering was not merely a financial one. There was a cultural price too, stemming from the fact that we couldn't open up the hood and mess around with it. Doug Barnes was right. Apple, in spite of its reputation as the machine of choice of scruffy, creative hacker types, had actually created a machine that discouraged hacking, while Microsoft, viewed as a technological laggard and copycat, had created a vast, disorderly parts bazaar--a primordial soup that eventually self-assembled into Linux.


THE HOLE HAWG OF OPERATING SYSTEMS


Unix has always lurked provocatively in the background of the operating system wars, like the Russian Army. Most people know it only by reputation, and its reputation, as the Dilbert cartoon suggests, is mixed. But everyone seems to agree that if it could only get its act together and stop surrendering vast tracts of rich agricultural land and hundreds of thousands of prisoners of war to the onrushing invaders, it could stomp them (and all other opposition) flat.

It is difficult to explain how Unix has earned this respect without going into mind-smashing technical detail. Perhaps the gist of it can be explained by telling a story about drills.

The Hole Hawg is a drill made by the Milwaukee Tool Company. If you look in a typical hardware store you may find smaller Milwaukee drills but not the Hole Hawg, which is too powerful and too expensive for homeowners. The Hole Hawg does not have the pistol-like design of a cheap homeowner's drill. It is a cube of solid metal with a handle sticking out of one face and a chuck mounted in another. The cube contains a disconcertingly potent electric motor. You can hold the handle and operate the trigger with your index finger, but unless you are exceptionally strong you cannot control the weight of the Hole Hawg with one hand; it is a two-hander all the way. In order to fight off the counter-torque of the Hole Hawg you use a separate handle (provided), which you screw into one side of the iron cube or the other depending on whether you are using your left or right hand to operate the trigger. This handle is not a sleek, ergonomically designed item as it would be in a homeowner's drill. It is simply a foot-long chunk of regular galvanized pipe, threaded on one end, with a black rubber handle on the other. If you lose it, you just go to the local plumbing supply store and buy another chunk of pipe.

During the Eighties I did some construction work. One day, another worker leaned a ladder against the outside of the building that we were putting up, climbed up to the second-story level, and used the Hole Hawg to drill a hole through the exterior wall. At some point, the drill bit caught in the wall. The Hole Hawg, following its one and only imperative, kept going. It spun the worker's body around like a rag doll, causing him to knock his own ladder down. Fortunately he kept his grip on the Hole Hawg, which remained lodged in the wall, and he simply dangled from it and shouted for help until someone came along and reinstated the ladder.

I myself used a Hole Hawg to drill many holes through studs, which it did as a blender chops cabbage. I also used it to cut a few six-inch-diameter holes through an old lath-and-plaster ceiling. I chucked in a new hole saw, went up to the second story, reached down between the newly installed floor joists, and began to cut through the first-floor ceiling below. Where my homeowner's drill had labored and whined to spin the huge bit around, and had stalled at the slightest obstruction, the Hole Hawg rotated with the stupid consistency of a spinning planet. When the hole saw seized up, the Hole Hawg spun itself and me around, and crushed one of my hands between the steel pipe handle and a joist, producing a few lacerations, each surrounded by a wide corona of deeply bruised flesh. It also bent the hole saw itself, though not so badly that I couldn't use it. After a few such run-ins, when I got ready to use the Hole Hawg my heart actually began to pound with atavistic terror.

But I never blamed the Hole Hawg; I blamed myself. The Hole Hawg is dangerous because it does exactly what you tell it to. It is not bound by the physical limitations that are inherent in a cheap drill, and neither is it limited by safety interlocks that might be built into a homeowner's product by a liability-conscious manufacturer. The danger lies not in the machine itself but in the user's failure to envision the full consequences of the instructions he gives to it.

A smaller tool is dangerous too, but for a completely different reason: it tries to do what you tell it to, and fails in some way that is unpredictable and almost always undesirable. But the Hole Hawg is like the genie of the ancient fairy tales, who carries out his master's instructions literally and precisely and with unlimited power, often with disastrous, unforeseen consequences.

Pre-Hole Hawg, I used to examine the drill selection in hardware stores with what I thought was a judicious eye, scorning the smaller low-end models and hefting the big expensive ones appreciatively, wishing I could afford one of them babies. Now I view them all with such contempt that I do not even consider them to be real drills--merely scaled-up toys designed to exploit the self-delusional tendencies of soft-handed homeowners who want to believe that they have purchased an actual tool. Their plastic casings, carefully designed and focus-group-tested to convey a feeling of solidity and power, seem disgustingly flimsy and cheap to me, and I am ashamed that I was ever bamboozled into buying such knickknacks.

It is not hard to imagine what the world would look like to someone who had been raised by contractors and who had never used any drill other than a Hole Hawg. Such a person, presented with the best and most expensive hardware-store drill, would not even recognize it as such. He might instead misidentify it as a child's toy, or some kind of motorized screwdriver. If a salesperson or a deluded homeowner referred to it as a drill, he would laugh and tell them that they were mistaken--they simply had their terminology wrong. His interlocutor would go away irritated, and probably feeling rather defensive about his basement full of cheap, dangerous, flashy, colorful tools.

Unix is the Hole Hawg of operating systems, and Unix hackers, like Doug Barnes and the guy in the Dilbert cartoon and many of the other people who populate Silicon Valley, are like contractor's sons who grew up using only Hole Hawgs. They might use Apple/Microsoft OSes to write letters, play video games, or balance their checkbooks, but they cannot really bring themselves to take these operating systems seriously.


THE ORAL TRADITION


Unix is hard to learn. The process of learning it is one of multiple small epiphanies. Typically you are just on the verge of inventing some necessary tool or utility when you realize that someone else has already invented it, and built it in, and this explains some odd file or directory or command that you have noticed but never really understood before.

For example there is a command (a small program, part of the OS) called whoami, which enables you to ask the computer who it thinks you are. On a Unix machine, you are always logged in under some name--possibly even your own! What files you may work with, and what software you may use, depends on your identity. When I started out using Linux, I was on a non-networked machine in my basement, with only one user account, and so when I became aware of the whoami command it struck me as ludicrous. But once you are logged in as one person, you can temporarily switch over to a pseudonym in order to access different files. If your machine is on the Internet, you can log onto other computers, provided you have a user name and a password. At that point the distant machine becomes no different in practice from the one right in front of you. These changes in identity and location can easily become nested inside each other, many layers deep, even if you aren't doing anything nefarious. Once you have forgotten who and where you are, the whoami command is indispensable. I use it all the time.
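Here, for the sake of concreteness, is a rough sketch of the kind of session described above; the user and host names are invented, and I've used ssh for the remote login, though telnet or rlogin would do just as well:

    $ whoami
    neal
    $ su - root                      # temporarily become the superuser
    Password:
    # whoami
    root
    # exit
    $ ssh frodo@faraway.example.com  # log onto a distant machine
    frodo@faraway.example.com's password:
    $ whoami
    frodo

Three layers in, whoami is the only thing standing between you and total disorientation.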

The file systems of Unix machines all have the same general structure. On your flimsy operating systems, you can create directories (folders) and give them names like Frodo or My Stuff and put them pretty much anywhere you like. But under Unix the highest level--the root--of the filesystem is always designated with the single character "/" and it always contains the same set of top-level directories:

/usr /etc /var /bin /proc /boot /home /root /sbin /dev /lib /tmp


and each of these directories typically has its own distinct structure of subdirectories. Note the obsessive use of abbreviations and avoidance of capital letters; this is a system invented by people to whom repetitive stress disorder is what black lung is to miners. Long names get worn down to three-letter nubbins, like stones smoothed by a river.

This is not the place to try to explain why each of the above directories exists, and what is contained in it. At first it all seems obscure; worse, it seems deliberately obscure. When I started using Linux I was accustomed to being able to create directories wherever I wanted and to give them whatever names struck my fancy. Under Unix you are free to do that, of course (you are free to do anything) but as you gain experience with the system you come to understand that the directories listed above were created for the best of reasons and that your life will be much easier if you follow along (within /home, by the way, you have pretty much unlimited freedom).

After this kind of thing has happened several hundred or thousand times, the hacker understands why Unix is the way it is, and agrees that it wouldn't be the same any other way. It is this sort of acculturation that gives Unix hackers their confidence in the system, and the attitude of calm, unshakable, annoying superiority captured in the Dilbert cartoon. Windows 95 and MacOS are products, contrived by engineers in the service of specific companies. Unix, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic.

What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again--making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it. This is very difficult to understand for people who are accustomed to thinking of OSes as things that absolutely have to be bought.

Many hackers have launched more or less successful re-implementations of the Unix ideal. Each one brings in new embellishments. Some of them die out quickly, some are merged with similar, parallel innovations created by different hackers attacking the same problem, others still are embraced, and adopted into the epic. Thus Unix has slowly accreted around a simple kernel and acquired a kind of complexity and asymmetry about it that is organic, like the roots of a tree, or the branchings of a coronary artery. Understanding it is more like anatomy than physics.

For at least a year, prior to my adoption of Linux, I had been hearing about it. Credible, well-informed people kept telling me that a bunch of hackers had got together an implementation of Unix that could be downloaded, free of charge, from the Internet. For a long time I could not bring myself to take the notion seriously. It was like hearing rumors that a group of model rocket enthusiasts had created a completely functional Saturn V by exchanging blueprints on the Net and mailing valves and flanges to each other.

But it's true. Credit for Linux generally goes to its human namesake, one Linus Torvalds, a Finn who got the whole thing rolling in 1991 when he used some of the GNU tools to write the beginnings of a Unix kernel that could run on PC-compatible hardware. And indeed Torvalds deserves all the credit he has ever gotten, and a whole lot more. But he could not have made it happen by himself, any more than Richard Stallman could have. To write code at all, Torvalds had to have cheap but powerful development tools, and these he got from Stallman's GNU project.

And he had to have cheap hardware on which to write that code. Cheap hardware is a much harder thing to arrange than cheap software; a single person (Stallman) can write software and put it up on the Net for free, but in order to make hardware it's necessary to have a whole industrial infrastructure, which is not cheap by any stretch of the imagination. Really the only way to make hardware cheap is to punch out an incredible number of copies of it, so that the unit cost eventually drops. For reasons already explained, Apple had no desire to see the cost of hardware drop. The only reason Torvalds had cheap hardware was Microsoft.

Microsoft refused to go into the hardware business, insisted on making its software run on hardware that anyone could build, and thereby created the market conditions that allowed hardware prices to plummet. In trying to understand the Linux phenomenon, then, we have to look not to a single innovator but to a sort of bizarre Trinity: Linus Torvalds, Richard Stallman, and Bill Gates. Take away any of these three and Linux would not exist.


OS SHOCK


Young Americans who leave their great big homogeneous country and visit some other part of the world typically go through several stages of culture shock: first, dumb wide-eyed astonishment. Then a tentative engagement with the new country's manners, cuisine, public transit systems and toilets, leading to a brief period of fatuous confidence that they are instant experts on the new country. As the visit wears on, homesickness begins to set in, and the traveler begins to appreciate, for the first time, how much he or she took for granted at home. At the same time it begins to seem obvious that many of one's own cultures and traditions are essentially arbitrary, and could have been different; driving on the right side of the road, for example. When the traveler returns home and takes stock of the experience, he or she may have learned a good deal more about America than about the country they went to visit.

For the same reasons, Linux is worth trying. It is a strange country indeed, but you don't have to live there; a brief sojourn suffices to give some flavor of the place and--more importantly--to lay bare everything that is taken for granted, and all that could have been done differently, under Windows or MacOS.

You can't try it unless you install it. With any other OS, installing it would be a straightforward transaction: in exchange for money, some company would give you a CD-ROM, and you would be on your way. But a lot is subsumed in that kind of transaction, and has to be gone through and picked apart.

We like plain dealings and straightforward transactions in America. If you go to Egypt and, say, take a taxi somewhere, you become a part of the taxi driver's life; he refuses to take your money because it would demean your friendship, he follows you around town, and weeps hot tears when you get in some other guy's taxi. You end up meeting his kids at some point, and have to devote all sorts of ingenuity to finding some way to compensate him without insulting his honor. It is exhausting. Sometimes you just want a simple Manhattan-style taxi ride.

But in order to have an American-style setup, where you can just go out and hail a taxi and be on your way, there must exist a whole hidden apparatus of medallions, inspectors, commissions, and so forth--which is fine as long as taxis are cheap and you can always get one. When the system fails to work in some way, it is mysterious and infuriating and turns otherwise reasonable people into conspiracy theorists. But when the Egyptian system breaks down, it breaks down transparently. You can't get a taxi, but your driver's nephew will show up, on foot, to explain the problem and apologize.

Microsoft and Apple do things the Manhattan way, with vast complexity hidden behind a wall of interface. Linux does things the Egypt way, with vast complexity strewn about all over the landscape. If you've just flown in from Manhattan, your first impulse will be to throw up your hands and say "For crying out loud! Will you people get a grip on yourselves!?" But this does not make friends in Linux-land any better than it would in Egypt.

You can suck Linux right out of the air, as it were, by downloading the right files and putting them in the right places, but there probably are not more than a few hundred people in the world who could create a functioning Linux system in that way. What you really need is a distribution of Linux, which means a prepackaged set of files. But distributions are a separate thing from Linux per se.

Linux per se is not a specific set of ones and zeroes, but a self-organizing Net subculture. The end result of its collective lucubrations is a vast body of source code, almost all written in C (the dominant computer programming language). "Source code" just means a computer program as typed in and edited by some hacker. If it's in C, the file name will probably have .c or .cpp on the end of it, depending on which dialect was used; if it's in some other language it will have some other suffix. Frequently these sorts of files can be found in a directory with the name /src which is the hacker's Hebraic abbreviation of "source."

Source files are useless to your computer, and of little interest to most users, but they are of gigantic cultural and political significance, because Microsoft and Apple keep them secret while Linux makes them public. They are the family jewels. They are the sort of thing that in Hollywood thrillers is used as a McGuffin: the plutonium bomb core, the top-secret blueprints, the suitcase of bearer bonds, the reel of microfilm. If the source files for Windows or MacOS were made public on the Net, then those OSes would become free, like Linux--only not as good, because no one would be around to fix bugs and answer questions. Linux is "open source" software meaning, simply, that anyone can get copies of its source code files.

Your computer doesn't want source code any more than you do; it wants object code. Object code files typically have the suffix .o and are unreadable to all but a few, highly strange humans, because they consist of ones and zeroes. Accordingly, this sort of file commonly shows up in a directory with the name /bin, for "binary."

Source files are simply ASCII text files. ASCII denotes a particular way of encoding letters into bit patterns. In an ASCII file, each character has eight bits all to itself. This creates a potential "alphabet" of 256 distinct characters, in that eight binary digits can form that many unique patterns. In practice, of course, we tend to limit ourselves to the familiar letters and digits. The bit-patterns used to represent those letters and digits are the same ones that were physically punched into the paper tape by my high school teletype, which in turn were the same ones used by the telegraph industry for decades previously. ASCII text files, in other words, are telegrams, and as such they have no typographical frills. But for the same reason they are eternal, because the code never changes, and universal, because every text editing and word processing software ever written knows about this code.

Therefore just about any software can be used to create, edit, and read source code files. Object code files, then, are created from these source files by a piece of software called a compiler, and forged into a working application by another piece of software called a linker.

The triad of editor, compiler, and linker, taken together, form the core of a software development system. Now, it is possible to spend a lot of money on shrink-wrapped development systems with lovely graphical user interfaces and various ergonomic enhancements. In some cases it might even be a good and reasonable way to spend money. But on this side of the road, as it were, the very best software is usually the free stuff. Editor, compiler and linker are to hackers what ponies, stirrups, and archery sets were to the Mongols. Hackers live in the saddle, and hack on their own tools even while they are using them to create new applications. It is quite inconceivable that superior hacking tools could have been created from a blank sheet of paper by product engineers. Even if they are the brightest engineers in the world they are simply outnumbered.

In the GNU/Linux world there are two major text editing programs: the minimalist vi (known in some implementations as elvis) and the maximalist emacs. I use emacs, which might be thought of as a thermonuclear word processor. It was created by Richard Stallman; enough said. It is written in Lisp, which is the only computer language that is beautiful. It is colossal, and yet it only edits straight ASCII text files, which is to say, no fonts, no boldface, no underlining. In other words, the engineer-hours that, in the case of Microsoft Word, were devoted to features like mail merge, and the ability to embed feature-length motion pictures in corporate memoranda, were, in the case of emacs, focused with maniacal intensity on the deceptively simple-seeming problem of editing text. If you are a professional writer--i.e., if someone else is getting paid to worry about how your words are formatted and printed--emacs outshines all other editing software in approximately the same way that the noonday sun does the stars. It is not just bigger and brighter; it simply makes everything else vanish. For page layout and printing you can use TeX: a vast corpus of typesetting lore written in C and also available on the Net for free.

I could say a lot about emacs and TeX, but right now I am trying to tell a story about how to actually install Linux on your machine. The hard-core survivalist approach would be to download an editor like emacs, and the GNU Tools--the compiler and linker--which are polished and excellent to the same degree as emacs. Equipped with these, one would be able to start downloading ASCII source code files (/src) and compiling them into binary object code files (/bin) that would run on the machine. But in order to even arrive at this point--to get emacs running, for example--you have to have Linux actually up and running on your machine. And even a minimal Linux operating system requires thousands of binary files all acting in concert, and arranged and linked together just so.

Several entities have therefore taken it upon themselves to create "distributions" of Linux. If I may extend the Egypt analogy slightly, these entities are a bit like tour guides who meet you at the airport, who speak your language, and who help guide you through the initial culture shock. If you are an Egyptian, of course, you see it the other way; tour guides exist to keep brutish outlanders from traipsing through your mosques and asking you the same questions over and over and over again.

Some of these tour guides are commercial organizations, such as Red Hat Software, which makes a Linux distribution called Red Hat that has a relatively commercial sheen to it. In most cases you put a Red Hat CD-ROM into your PC and reboot and it handles the rest. Just as a tour guide in Egypt will expect some sort of compensation for his services, commercial distributions need to be paid for. In most cases they cost almost nothing and are well worth it.

I use a distribution called Debian (the word is a contraction of "Deborah" and "Ian") which is non-commercial. It is organized (or perhaps I should say "it has organized itself") along the same lines as Linux in general, which is to say that it consists of volunteers who collaborate over the Net, each responsible for looking after a different chunk of the system. These people have broken Linux down into a number of packages, which are compressed files that can be downloaded to an already functioning Debian Linux system, then opened up and unpacked using a free installer application. Of course, as such, Debian has no commercial arm--no distribution mechanism. You can download all Debian packages over the Net, but most people will want to have them on a CD-ROM. Several different companies have taken it upon themselves to decoct all of the current Debian packages onto CD-ROMs and then sell them. I buy mine from Linux Systems Labs. The cost for a three-disc set, containing Debian in its entirety, is less than three dollars. But (and this is an important distinction) not a single penny of that three dollars is going to any of the coders who created Linux, nor to the Debian packagers. It goes to Linux Systems Labs and it pays, not for the software, or the packages, but for the cost of stamping out the CD-ROMs.

Every Linux distribution embodies some more or less clever hack for circumventing the normal boot process and causing your computer, when it is turned on, to organize itself, not as a PC running Windows, but as a "host" running Unix. This is slightly alarming the first time you see it, but completely harmless. When a PC boots up, it goes through a little self-test routine, taking an inventory of available disks and memory, and then begins looking around for a disk to boot up from. In any normal Windows computer that disk will be a hard drive. But if you have your system configured right, it will look first for a floppy or CD-ROM disk, and boot from that if one is available.

Linux exploits this chink in the defenses. Your computer notices a bootable disk in the floppy or CD-ROM drive, loads in some object code from that disk, and blindly begins to execute it. But this is not Microsoft or Apple code, this is Linux code, and so at this point your computer begins to behave very differently from what you are accustomed to. Cryptic messages begin to scroll up the screen. If you had booted a commercial OS, you would, at this point, be seeing a "Welcome to MacOS" cartoon, or a screen filled with clouds in a blue sky, and a Windows logo. But under Linux you get a long telegram printed in stark white letters on a black screen. There is no "welcome!" message. Most of the telegram has the semi-inscrutable menace of graffiti tags.

Dec 14 15:04:15 theRev syslogd 1.3-3#17: restart. Dec 14 15:04:15 theRev kernel: klogd 1.3-3, log source = /proc/kmsg started. Dec 14 15:04:15 theRev kernel: Loaded 3535 symbols from /System.map. Dec 14 15:04:15 theRev kernel: Symbols match kernel version 2.0.30. Dec 14 15:04:15 theRev kernel: No module symbols loaded. Dec 14 15:04:15 theRev kernel: Intel MultiProcessor Specification v1.4 Dec 14 15:04:15 theRev kernel: Virtual Wire compatibility mode. Dec 14 15:04:15 theRev kernel: OEM ID: INTEL Product ID: 440FX APIC at: 0xFEE00000 Dec 14 15:04:15 theRev kernel: Processor #0 Pentium(tm) Pro APIC version 17 Dec 14 15:04:15 theRev kernel: Processor #1 Pentium(tm) Pro APIC version 17 Dec 14 15:04:15 theRev kernel: I/O APIC #2 Version 17 at 0xFEC00000. Dec 14 15:04:15 theRev kernel: Processors: 2 Dec 14 15:04:15 theRev kernel: Console: 16 point font, 400 scans Dec 14 15:04:15 theRev kernel: Console: colour VGA+ 80x25, 1 virtual console (max 63) Dec 14 15:04:15 theRev kernel: pcibios_init : BIOS32 Service Directory structure at 0x000fdb70 Dec 14 15:04:15 theRev kernel: pcibios_init : BIOS32 Service Directory entry at 0xfdb80 Dec 14 15:04:15 theRev kernel: pcibios_init : PCI BIOS revision 2.10 entry at 0xfdba1 Dec 14 15:04:15 theRev kernel: Probing PCI hardware. Dec 14 15:04:15 theRev kernel: Warning : Unknown PCI device (10b7:9001). Please read include/linux/pci.h Dec 14 15:04:15 theRev kernel: Calibrating delay loop.. ok - 179.40 BogoMIPS Dec 14 15:04:15 theRev kernel: Memory: 64268k/66556k available (700k kernel code, 384k reserved, 1204k data) Dec 14 15:04:15 theRev kernel: Swansea University Computer Society NET3.035 for Linux 2.0 Dec 14 15:04:15 theRev kernel: NET3: Unix domain sockets 0.13 for Linux NET3.035. Dec 14 15:04:15 theRev kernel: Swansea University Computer Society TCP/IP for NET3.034 Dec 14 15:04:15 theRev kernel: IP Protocols: ICMP, UDP, TCP Dec 14 15:04:15 theRev kernel: Checking 386/387 coupling... Ok, fpu using exception 16 error reporting. Dec 14 15:04:15 theRev kernel: Checking 'hlt' instruction... Ok. Dec 14 15:04:15 theRev kernel: Linux version 2.0.30 (root@theRev) (gcc version 2.7.2.1) #15 Fri Mar 27 16:37:24 PST 1998 Dec 14 15:04:15 theRev kernel: Booting processor 1 stack 00002000: Calibrating delay loop.. ok - 179.40 BogoMIPS Dec 14 15:04:15 theRev kernel: Total of 2 processors activated (358.81 BogoMIPS). Dec 14 15:04:15 theRev kernel: Serial driver version 4.13 with no serial options enabled Dec 14 15:04:15 theRev kernel: tty00 at 0x03f8 (irq = 4) is a 16550A Dec 14 15:04:15 theRev kernel: tty01 at 0x02f8 (irq = 3) is a 16550A Dec 14 15:04:15 theRev kernel: lp1 at 0x0378, (polling) Dec 14 15:04:15 theRev kernel: PS/2 auxiliary pointing device detected -- driver installed. Dec 14 15:04:15 theRev kernel: Real Time Clock Driver v1.07 Dec 14 15:04:15 theRev kernel: loop: registered device at major 7 Dec 14 15:04:15 theRev kernel: ide: i82371 PIIX (Triton) on PCI bus 0 function 57 Dec 14 15:04:15 theRev kernel: ide0: BM-DMA at 0xffa0-0xffa7 Dec 14 15:04:15 theRev kernel: ide1: BM-DMA at 0xffa8-0xffaf Dec 14 15:04:15 theRev kernel: hda: Conner Peripherals 1275MB - CFS1275A, 1219MB w/64kB Cache, LBA, CHS=619/64/63 Dec 14 15:04:15 theRev kernel: hdb: Maxtor 84320A5, 4119MB w/256kB Cache, LBA, CHS=8928/15/63, DMA Dec 14 15:04:15 theRev kernel: hdc: , ATAPI CDROM drive Dec 15 11:58:06 theRev kernel: ide0 at 0x1f0-0x1f7,0x3f6 on irq 14 Dec 15 11:58:06 theRev kernel: ide1 at 0x170-0x177,0x376 on irq 15 Dec 15 11:58:06 theRev kernel: Floppy drive(s): fd0 is 1.44M Dec 15 11:58:06 theRev kernel: Started kswapd v 1.4.2.2 Dec 15 11:58:06 theRev kernel: FDC 0 is a National Semiconductor PC87306 Dec 15 11:58:06 theRev kernel: md driver 0.35 MAX_MD_DEV=4, MAX_REAL=8 Dec 15 11:58:06 theRev kernel: PPP: version 2.2.0 (dynamic channel allocation) Dec 15 11:58:06 theRev kernel: TCP compression code copyright 1989 Regents of the University of California Dec 15 11:58:06 theRev kernel: PPP Dynamic channel allocation code copyright 1995 Caldera, Inc. Dec 15 11:58:06 theRev kernel: PPP line discipline registered. Dec 15 11:58:06 theRev kernel: SLIP: version 0.8.4-NET3.019-NEWTTY (dynamic channels, max=256). Dec 15 11:58:06 theRev kernel: eth0: 3Com 3c900 Boomerang 10Mbps/Combo at 0xef00, 00:60:08:a4:3c:db, IRQ 10 Dec 15 11:58:06 theRev kernel: 8K word-wide RAM 3:5 Rx:Tx split, 10base2 interface. Dec 15 11:58:06 theRev kernel: Enabling bus-master transmits and whole-frame receives. Dec 15 11:58:06 theRev kernel: 3c59x.c:v0.49 1/2/98 Donald Becker http://cesdis.gsfc.nasa.gov/linux/drivers/vortex.html Dec 15 11:58:06 theRev kernel: Partition check: Dec 15 11:58:06 theRev kernel: hda: hda1 hda2 hda3 Dec 15 11:58:06 theRev kernel: hdb: hdb1 hdb2 Dec 15 11:58:06 theRev kernel: VFS: Mounted root (ext2 filesystem) readonly. Dec 15 11:58:06 theRev kernel: Adding Swap: 16124k swap-space (priority -1) Dec 15 11:58:06 theRev kernel: EXT2-fs warning: maximal mount count reached, running e2fsck is recommended Dec 15 11:58:06 theRev kernel: hdc: media changed Dec 15 11:58:06 theRev kernel: ISO9660 Extensions: RRIP_1991A Dec 15 11:58:07 theRev syslogd 1.3-3#17: restart. Dec 15 11:58:09 theRev diald[87]: Unable to open options file /etc/diald/diald.options: No such file or directory Dec 15 11:58:09 theRev diald[87]: No device specified. You must have at least one device! Dec 15 11:58:09 theRev diald[87]: You must define a connector script (option 'connect'). Dec 15 11:58:09 theRev diald[87]: You must define the remote ip address. Dec 15 11:58:09 theRev diald[87]: You must define the local ip address. Dec 15 11:58:09 theRev diald[87]: Terminating due to damaged reconfigure.

The only parts of this that are readable, for normal people, are the error messages and warnings. And yet it's noteworthy that Linux doesn't stop, or crash, when it encounters an error; it spits out a pithy complaint, gives up on whatever processes were damaged, and keeps on rolling. This was decidedly not true of the early versions of Apple and Microsoft OSes, for the simple reason that an OS that is not capable of walking and chewing gum at the same time cannot possibly recover from errors. Looking for, and dealing with, errors requires a separate process running in parallel with the one that has erred. A kind of superego, if you will, that keeps an eye on all of the others, and jumps in when one goes astray. Now that MacOS and Windows can do more than one thing at a time they are much better at dealing with errors than they used to be, but they are not even close to Linux or other Unices in this respect; and their greater complexity has made them vulnerable to new types of errors.

FALLIBILITY, ATONEMENT, REDEMPTION, TRUST, AND OTHER ARCANE TECHNICAL 
CONCEPTS

Linux is not capable of having any centrally organized policies dictating how to write error messages and documentation, and so each programmer writes his own. Usually they are in English even though tons of Linux programmers are Europeans. Frequently they are funny. Always they are honest. If something bad has happened because the software simply isn't finished yet, or because the user screwed something up, this will be stated forthrightly. The command line interface makes it easy for programs to dribble out little comments, warnings, and messages here and there. Even if the application is imploding like a damaged submarine, it can still usually eke out a little S.O.S. message. Sometimes when you finish working with a program and shut it down, you find that it has left behind a series of mild warnings and low-grade error messages in the command-line interface window from which you launched it. As if the software were chatting to you about how it was doing the whole time you were working with it.

Documentation, under Linux, comes in the form of man (short for manual) pages. You can access these either through a GUI (xman) or from the command line (man). Here is a sample from the man page for a program called rsh:

"Stop signals stop the local rsh process only; this is arguably wrong, but 
curre ntly hard to fix for reasons too complicated to explain here."

The man pages contain a lot of such material, which reads like the terse mutterings of pilots wrestling with the controls of damaged airplanes. The general feel is of a thousand monumental but obscure struggles seen in the stop-action light of a strobe. Each programmer is dealing with his own obstacles and bugs; he is too busy fixing them, and improving the software, to explain things at great length or to maintain elaborate pretensions.

In practice you hardly ever encounter a serious bug while running Linux. When you do, it is almost always with commercial software (several vendors sell software that runs under Linux). The operating system and its fundamental utility programs are too important to contain serious bugs. I have been running Linux every day since late 1995 and have seen many application programs go down in flames, but I have never seen the operating system crash. Never. Not once. There are quite a few Linux systems that have been running continuously and working hard for months or years without needing to be rebooted.

Commercial OSes have to adopt the same official stance towards errors as Communist countries had towards poverty. For doctrinal reasons it was not possible to admit that poverty was a serious problem in Communist countries, because the whole point of Communism was to eradicate poverty. Likewise, commercial OS companies like Apple and Microsoft can't go around admitting that their software has bugs and that it crashes all the time, any more than Disney can issue press releases stating that Mickey Mouse is an actor in a suit.

This is a problem, because errors do exist and bugs do happen. Every few months Bill Gates tries to demo a new Microsoft product in front of a large audience only to have it blow up in his face. Commercial OS vendors, as a direct consequence of being commercial, are forced to adopt the grossly disingenuous position that bugs are rare aberrations, usually someone else's fault, and therefore not really worth talking about in any detail. This posture, which everyone knows to be absurd, is not limited to press releases and ad campaigns. It informs the whole way these companies do business and relate to their customers. If the documentation were properly written, it would mention bugs, errors, and crashes on every single page. If the on-line help systems that come with these OSes reflected the experiences and concerns of their users, they would largely be devoted to instructions on how to cope with crashes and errors.

But this does not happen. Joint stock corporations are wonderful inventions that have given us many excellent goods and services. They are good at many things. Admitting failure is not one of them. Hell, they can't even admit minor shortcomings.

Of course, this behavior is not as pathological in a corporation as it would be in a human being. Most people, nowadays, understand that corporate press releases are issued for the benefit of the corporation's shareholders and not for the enlightenment of the public. Sometimes the results of this institutional dishonesty can be dreadful, as with tobacco and asbestos. In the case of commercial OS vendors it is nothing of the kind, of course; it is merely annoying.

Some might argue that consumer annoyance, over time, builds up into a kind of hardened plaque that can conceal serious decay, and that honesty might therefore be the best policy in the long run; the jury is still out on this in the operating system market. The business is expanding fast enough that it's still much better to have billions of chronically annoyed customers than millions of happy ones.

Most system administrators I know who work with Windows NT all the time agree that when it hits a snag, it has to be re-booted, and when it gets seriously messed up, the only way to fix it is to re-install the operating system from scratch. Or at least this is the only way that they know of to fix it, which amounts to the same thing. It is quite possible that the engineers at Microsoft have all sorts of insider knowledge on how to fix the system when it goes awry, but if they do, they do not seem to be getting the message out to any of the actual system administrators I know.

Because Linux is not commercial--because it is, in fact, free, as well as rather difficult to obtain, install, and operate--it does not have to maintain any pretensions as to its reliability. Consequently, it is much more reliable. When something goes wrong with Linux, the error is noticed and loudly discussed right away. Anyone with the requisite technical knowledge can go straight to the source code and point out the source of the error, which is then rapidly fixed by whichever hacker has carved out responsibility for that particular program.

As far as I know, Debian is the only Linux distribution that has its own constitution (http://www.debian.org/devel/constitution), but what really sold me on it was its phenomenal bug database (http://www.debian.org/Bugs), which is a sort of interactive Doomsday Book of error, fallibility, and redemption. It is simplicity itself. When I had a problem with Debian in early January of 1997, I sent in a message describing the problem to submit@bugs.debian.org. My problem was promptly assigned a bug report number (#6518) and a severity level (the available choices being critical, grave, important, normal, fixed, and wishlist) and forwarded to mailing lists where Debian people hang out. Within twenty-four hours I had received five e-mails telling me how to fix the problem: two from North America, two from Europe, and one from Australia. All of these e-mails gave me the same suggestion, which worked, and made my problem go away. But at the same time, a transcript of this exchange was posted to Debian's bug database, so that if other users had the same problem later, they would be able to search through and find the solution without having to enter a new, redundant bug report.
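For what it's worth, the bug report itself is just another ASCII telegram. Roughly--the package name, version, and details here are invented for illustration--it looks like this, with a few pseudo-header lines at the top that the bug tracker reads:

    To: submit@bugs.debian.org
    Subject: foo: dies when its configuration file is missing

    Package: foo
    Version: 1.2-3
    Severity: normal

    When /etc/foo.conf does not exist, foo exits with a segmentation
    fault instead of printing a sensible error message.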

Contrast this with the experience that I had when I tried to install Windows NT 4.0 on the very same machine about ten months later, in late 1997. The installation program simply stopped in the middle with no error messages. I went to the Microsoft Support website and tried to perform a search for existing help documents that would address my problem. The search engine was completely nonfunctional; it did nothing at all. It did not even give me a message telling me that it was not working.

Eventually I decided that my motherboard must be at fault; it was of a slightly unusual make and model, and NT did not support as many different motherboards as Linux. I am always looking for excuses, no matter how feeble, to buy new hardware, so I bought a new motherboard that was Windows NT logo-compatible, meaning that the Windows NT logo was printed right on the box. I installed this into my computer and got Linux running right away, then attempted to install Windows NT again. Again, the installation died without any error message or explanation. By this time a couple of weeks had gone by and I thought that perhaps the search engine on the Microsoft Support website might be up and running. I gave that a try but it still didn't work.

So I created a new Microsoft support account, then logged on to submit the incident. I supplied my product ID number when asked, and then began to follow the instructions on a series of help screens. In other words, I was submitting a bug report just as with the Debian bug tracking system. It's just that the interface was slicker--I was typing my complaint into little text-editing boxes on Web forms, doing it all through the GUI, whereas with Debian you send in an e-mail telegram. I knew that when I was finished submitting the bug report, it would become proprietary Microsoft information, and other users wouldn't be able to see it. Many Linux users would refuse to participate in such a scheme on ethical grounds, but I was willing to give it a shot as an experiment. In the end, though, I was never able to submit my bug report, because the series of linked web pages that I was filling out eventually led me to a completely blank page: a dead end.

So I went back and clicked on the buttons for "phone support" and eventually was given a Microsoft telephone number. When I dialed this number I got a series of piercing beeps and a recorded message from the phone company saying "We're sorry, your call cannot be completed as dialed."

I tried the search page again--it was still completely nonfunctional. Then I tried PPI (Pay Per Incident) again. This led me through another series of Web pages until I dead-ended at one reading: "Notice-there is no Web page matching your request."

I tried it again, and eventually got to a Pay Per Incident screen reading: "OUT OF INCIDENTS. There are no unused incidents left in your account. If you would like to purchase a support incident, click OK-you will then be able to prepay for an incident...." The cost per incident was $95.

The experiment was beginning to seem rather expensive, so I gave up on the PPI approach and decided to have a go at the FAQs posted on Microsoft's website. None of the available FAQs had anything to do with my problem except for one entitled "I am having some problems installing NT" which appeared to have been written by flacks, not engineers.

So I gave up and still, to this day, have never gotten Windows NT installed on that particular machine. For me, the path of least resistance was simply to use Debian Linux.

In the world of open source software, bug reports are useful information. Making them public is a service to other users, and improves the OS. Making them public systematically is so important that highly intelligent people voluntarily put time and money into running bug databases. In the commercial OS world, however, reporting a bug is a privilege that you have to pay lots of money for. But if you pay for it, it follows that the bug report must be kept confidential--otherwise anyone could get the benefit of your ninety-five bucks! And yet nothing prevents NT users from setting up their own public bug database.

This is, in other words, another feature of the OS market that simply makes no sense unless you view it in the context of culture. What Microsoft is selling through Pay Per Incident isn't technical support so much as the continued illusion that its customers are engaging in some kind of rational business transaction. It is a sort of routine maintenance fee for the upkeep of the fantasy. If people really wanted a solid OS they would use Linux, and if they really wanted tech support they would find a way to get it; Microsoft's customers want something else