"In the Beginning
                            was the Command Line"
                             By Neal Stephenson
                                    (1999)

     About twenty years ago Jobs and Wozniak, the founders of Apple, came
     up with the very strange idea of selling information processing
     machines for use in the home. The business took off, and its
     founders made a lot of money and received the credit they deserved
     for being daring visionaries. But around the same time, Bill Gates
     and Paul Allen came up with an idea even stranger and more
     fantastical: selling computer operating systems. This was much
     weirder than the idea of Jobs and Wozniak. A computer at least had
     some sort of physical reality to it. It came in a box, you could
     open it up and plug it in and watch lights blink. An operating
     system had no tangible incarnation at all. It arrived on a disk, of
     course, but the disk was, in effect, nothing more than the box that
     the OS came in. The product itself was a very long string of ones
     and zeroes that, when properly installed and coddled, gave you the
     ability to manipulate other very long strings of ones and zeroes.
     Even those few who actually understood what a computer operating
     system was were apt to think of it as a fantastically arcane
     engineering prodigy, like a breeder reactor or a U-2 spy plane, and
     not something that could ever be (in the parlance of high-tech)
     "productized."

     Yet now the company that Gates and Allen founded is selling
     operating systems like Gillette sells razor blades. New releases of
     operating systems are launched as if they were Hollywood
     blockbusters, with celebrity endorsements, talk show appearances,
     and world tours. The market for them is vast enough that people
     worry about whether it has been monopolized by one company. Even the
     least technically-minded people in our society now have at least a
     hazy idea of what operating systems do; what is more, they have
     strong opinions about their relative merits. It is commonly
     understood, even by technically unsophisticated computer users, that
     if you have a piece of software that works on your Macintosh, and
     you move it over onto a Windows machine, it will not run. That this
     would, in fact, be a laughable and idiotic mistake, like nailing
     horseshoes to the tires of a Buick.

     A person who went into a coma before Microsoft was founded, and woke
     up now, could pick up this morning's New York Times and understand
     everything in it--almost:
     * Item: the richest man in the world made his fortune from--what?
       Railways? Shipping? Oil? No, operating systems.
     * Item: the Department of Justice is tackling Microsoft's supposed OS
       monopoly with legal tools that were invented to restrain the power
       of Nineteenth-Century robber barons.
     * Item: a woman friend of mine recently told me that she'd broken off
       a (hitherto) stimulating exchange of e-mail with a young man. At
       first he had seemed like such an intelligent and interesting guy,
       she said, but then "he started going all PC-versus-Mac on me."

     What the hell is going on here? And does the operating system
     business have a future, or only a past? Here is my view, which is
     entirely subjective; but since I have spent a fair amount of time
     not only using, but programming, Macintoshes, Windows machines,
     Linux boxes and the BeOS, perhaps it is not so ill-informed as to be
     completely worthless. This is a subjective essay, more review than
     research paper, and so it might seem unfair or biased compared to
     the technical reviews you can find in PC magazines. But ever since
     the Mac came out, our operating systems have been based on
     metaphors, and anything with metaphors in it is fair game as far as
     I'm concerned.

     MGBs, TANKS, AND BATMOBILES

     Around the time that Jobs, Wozniak, Gates, and Allen were dreaming
     up these unlikely schemes, I was a teenager living in Ames, Iowa.
     One of my friends' dads had an old MGB sports car rusting away in
     his garage. Sometimes he would actually manage to get it running and
     then he would take us for a spin around the block, with a memorable
     look of wild youthful exhilaration on his face; to his worried
     passengers, he was a madman, stalling and backfiring around Ames,
     Iowa and eating the dust of rusty Gremlins and Pintos, but in his
     own mind he was Dustin Hoffman tooling across the Bay Bridge with
     the wind in his hair.

     In retrospect, this was telling me two things about people's
     relationship to technology. One was that romance and image go a long
     way towards shaping their opinions. If you doubt it (and if you have
     a lot of spare time on your hands) just ask anyone who owns a
     Macintosh and who, on those grounds, imagines him- or herself to be
     a member of an oppressed minority group.

     The other, somewhat subtler point, was that interface is very
     important. Sure, the MGB was a lousy car in almost every way that
     counted: balky, unreliable, underpowered. But it was fun to drive.
     It was responsive. Every pebble on the road was felt in the bones,
     every nuance in the pavement transmitted instantly to the driver's
     hands. He could listen to the engine and tell what was wrong with
     it. The steering responded immediately to commands from his hands.
     To us passengers it was a pointless exercise in going nowhere--about
     as interesting as peering over someone's shoulder while he punches
     numbers into a spreadsheet. But to the driver it was an experience.
     For a short time he was extending his body and his senses into a
     larger realm, and doing things that he couldn't do unassisted.

     The analogy between cars and operating systems is not half bad, and
     so let me run with it for a moment, as a way of giving an executive
     summary of our situation today.

     Imagine a crossroads where four competing auto dealerships are
     situated. One of them (Microsoft) is much, much bigger than the
     others. It started out years ago selling three-speed bicycles
     (MS-DOS); these were not perfect, but they worked, and when they
     broke you could easily fix them.

     There was a competing bicycle dealership next door (Apple) that one
     day began selling motorized vehicles--expensive but attractively
     styled cars with their innards hermetically sealed, so that how they
     worked was something of a mystery.

     The big dealership responded by rushing a moped upgrade kit (the
     original Windows) onto the market. This was a Rube Goldberg
     contraption that, when bolted onto a three-speed bicycle, enabled it
     to keep up, just barely, with Apple-cars. The users had to wear
     goggles and were always picking bugs out of their teeth while Apple
     owners sped along in hermetically sealed comfort, sneering out the
     windows. But the Micro-mopeds were cheap, and easy to fix compared
     with the Apple-cars, and their market share waxed.

     Eventually the big dealership came out with a full-fledged car: a
     colossal station wagon (Windows 95). It had all the aesthetic appeal
     of a Soviet worker housing block, it leaked oil and blew gaskets,
     and it was an enormous success. A little later, they also came out
     with a hulking off-road vehicle intended for industrial users
     (Windows NT) which was no more beautiful than the station wagon, and
     only a little more reliable.

     Since then there has been a lot of noise and shouting, but little
     has changed. The smaller dealership continues to sell sleek
     Euro-styled sedans and to spend a lot of money on advertising
     campaigns. They have had GOING OUT OF BUSINESS! signs taped up in
     their windows for so long that they have gotten all yellow and
     curly. The big one keeps making bigger and bigger station wagons and
     ORVs.

     On the other side of the road are two competitors that have come
     along more recently.

     One of them (Be, Inc.) is selling fully operational Batmobiles (the
     BeOS). They are more beautiful and stylish even than the
     Euro-sedans, better designed, more technologically advanced, and at
     least as reliable as anything else on the market--and yet cheaper
     than the others.

     With one exception, that is: Linux, which is right next door, and
     which is not a business at all. It's a bunch of RVs, yurts, tepees,
     and geodesic domes set up in a field and organized by consensus. The
     people who live there are making tanks. These are not old-fashioned,
     cast-iron Soviet tanks; these are more like the M1 tanks of the U.S.
     Army, made of space-age materials and jammed with sophisticated
     technology from one end to the other. But they are better than Army
     tanks. They've been modified in such a way that they never, ever
     break down, are light and maneuverable enough to use on ordinary
     streets, and use no more fuel than a subcompact car. These tanks are
     being cranked out, on the spot, at a terrific pace, and a vast
     number of them are lined up along the edge of the road with keys in
     the ignition. Anyone who wants can simply climb into one and drive
     it away for free.

     Customers come to this crossroads in throngs, day and night. Ninety
     percent of them go straight to the biggest dealership and buy
     station wagons or off-road vehicles. They do not even look at the
     other dealerships.

     Of the remaining ten percent, most go and buy a sleek Euro-sedan,
     pausing only to turn up their noses at the philistines going to buy
     the station wagons and ORVs. If they even notice the people on the
     opposite side of the road, selling the cheaper, technically superior
     vehicles, these customers deride them as cranks and half-wits.

     The Batmobile outlet sells a few vehicles to the occasional car nut
     who wants a second vehicle to go with his station wagon, but seems
     to accept, at least for now, that it's a fringe player.

     The group giving away the free tanks only stays alive because it is
     staffed by volunteers, who are lined up at the edge of the street
     with bullhorns, trying to draw customers' attention to this
     incredible situation. A typical conversation goes something like
     this:

     Hacker with bullhorn: "Save your money! Accept one of our free
     tanks! It is invulnerable, and can drive across rocks and swamps at
     ninety miles an hour while getting a hundred miles to the gallon!"

     Prospective station wagon buyer: "I know what you say is
     true...but...er...I don't know how to maintain a tank!"

     Bullhorn: "You don't know how to maintain a station wagon either!"

     Buyer: "But this dealership has mechanics on staff. If something
     goes wrong with my station wagon, I can take a day off work, bring
     it here, and pay them to work on it while I sit in the waiting room
     for hours, listening to elevator music."

     Bullhorn: "But if you accept one of our free tanks we will send
     volunteers to your house to fix it for free while you sleep!"

     Buyer: "Stay away from my house, you freak!"

     Bullhorn: "But..."

     Buyer: "Can't you see that everyone is buying station wagons?"

     BIT-FLINGER

     The connection between cars, and ways of interacting with computers,
     wouldn't have occurred to me at the time I was being taken for rides
     in that MGB. I had signed up to take a computer programming class at
     Ames High School. After a few introductory lectures, we students
     were granted admission into a tiny room containing a teletype, a
     telephone, and an old-fashioned modem consisting of a metal box with
     a pair of rubber cups on the top (note: many readers, making their
     way through that last sentence, probably felt an initial pang of
     dread that this essay was about to turn into a tedious, codgerly
     reminiscence about how tough we had it back in the old days; rest
     assured that I am actually positioning my pieces on the chessboard,
     as it were, in preparation to make a point about truly hip and
     up-to-the-minute topics like Open Source Software). The teletype was
     exactly the same sort of machine that had been used, for decades, to
     send and receive telegrams. It was basically a loud typewriter that
     could only produce UPPERCASE LETTERS. Mounted to one side of it was
     a smaller machine with a long reel of paper tape on it, and a clear
     plastic hopper underneath.

     In order to connect this device (which was not a computer at all) to
     the Iowa State University mainframe across town, you would pick up
     the phone, dial the computer's number, listen for strange noises,
     and then slam the handset down into the rubber cups. If your aim was
     true, one would wrap its neoprene lips around the earpiece and the
     other around the mouthpiece, consummating a kind of informational
     soixante-neuf. The teletype would shudder as it was possessed by the
     spirit of the distant mainframe, and begin to hammer out cryptic
     messages.

     Since computer time was a scarce resource, we used a sort of batch
     processing technique. Before dialing the phone, we would turn on the
     tape puncher (a subsidiary machine bolted to the side of the
     teletype) and type in our programs. Each time we depressed a key,
     the teletype would bash out a letter on the paper in front of us, so
     we could read what we'd typed; but at the same time it would convert
     the letter into a set of eight binary digits, or bits, and punch a
     corresponding pattern of holes across the width of a paper tape. The
     tiny disks of paper knocked out of the tape would flutter down into
     the clear plastic hopper, which would slowly fill up with what can only
     be described as actual bits. On the last day of the school year, the
     smartest kid in the class (not me) jumped out from behind his desk
     and flung several quarts of these bits over the head of our teacher,
     like confetti, as a sort of semi-affectionate practical joke. The
     image of this man sitting there, gripped in the opening stages of an
     atavistic fight-or-flight reaction, with millions of bits
     (megabytes) sifting down out of his hair and into his nostrils and
     mouth, his face gradually turning purple as he built up to an
     explosion, is the single most memorable scene from my formal
     education.
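
     For the curious, the keystroke-to-holes conversion amounted to
     something like the following--a present-day Python sketch of mine,
     not anything that ran in 1973, and assuming plain 8-bit ASCII where
     a real teletype would have used Baudot or ASCII-with-parity:
def punch(line):
    # Each character becomes eight binary digits; each 1-bit becomes a hole
    # punched across the width of the paper tape ('o' = hole, '.' = blank).
    for ch in line.upper():               # the teletype was UPPERCASE-only
        bits = format(ord(ch), "08b")
        holes = "".join("o" if b == "1" else "." for b in bits)
        print(ch, holes)

punch("10 PRINT X")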

     Anyway, it will have been obvious that my interaction with the
     computer was of an extremely formal nature, being sharply divided up
     into different phases, viz.: (1) sitting at home with paper and
     pencil, miles and miles from any computer, I would think very, very
     hard about what I wanted the computer to do, and translate my
     intentions into a computer language--a series of alphanumeric
     symbols on a page. (2) I would carry this across a sort of
     informational cordon sanitaire (three miles of snowdrifts) to school
     and type those letters into a machine--not a computer--which would
     convert the symbols into binary numbers and record them visibly on a
     tape. (3) Then, through the rubber-cup modem, I would cause those
     numbers to be sent to the university mainframe, which would (4) do
     arithmetic on them and send different numbers back to the teletype.
     (5) The teletype would convert these numbers back into letters and
     hammer them out on a page and (6) I, watching, would construe the
     letters as meaningful symbols.

     The division of responsibilities implied by all of this is admirably
     clean: computers do arithmetic on bits of information. Humans
     construe the bits as meaningful symbols. But this distinction is now
     being blurred, or at least complicated, by the advent of modern
     operating systems that use, and frequently abuse, the power of
     metaphor to make computers accessible to a larger audience. Along
     the way--possibly because of those metaphors, which make an
     operating system a sort of work of art--people start to get
     emotional, and grow attached to pieces of software in the way that
     my friend's dad did to his MGB.

     People who have only interacted with computers through graphical
     user interfaces like the MacOS or Windows--which is to say, almost
     everyone who has ever used a computer--may have been startled, or at
     least bemused, to hear about the telegraph machine that I used to
     communicate with a computer in 1973. But there was, and is, a good
     reason for using this particular kind of technology. Human beings
     have various ways of communicating to each other, such as music,
     art, dance, and facial expressions, but some of these are more
     amenable than others to being expressed as strings of symbols.
     Written language is the easiest of all, because, of course, it
     consists of strings of symbols to begin with. If the symbols happen
     to belong to a phonetic alphabet (as opposed to, say, ideograms),
     converting them into bits is a trivial procedure, and one that was
     nailed, technologically, in the early nineteenth century, with the
     introduction of Morse code and other forms of telegraphy.

     We had a human/computer interface a hundred years before we had
     computers. When computers came into being around the time of the
     Second World War, humans, quite naturally, communicated with them by
     simply grafting them on to the already-existing technologies for
     translating letters into bits and vice versa: teletypes and punch
     card machines.

     These embodied two fundamentally different approaches to computing.
     When you were using cards, you'd punch a whole stack of them and run
     them through the reader all at once, which was called batch
     processing. You could also do batch processing with a teletype, as I
     have already described, by using the paper tape reader, and we were
     certainly encouraged to use this approach when I was in high school.
     But--though efforts were made to keep us unaware of this--the
     teletype could do something that the card reader could not. On the
     teletype, once the modem link was established, you could just type
     in a line and hit the return key. The teletype would send that line
     to the computer, which might or might not respond with some lines of
     its own, which the teletype would hammer out--producing, over time,
     a transcript of your exchange with the machine. This way of doing it
     did not even have a name at the time, but when, much later, an
     alternative became available, it was retroactively dubbed the
     Command Line Interface.
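
     The flavor of that exchange is easy to fake: here is a toy command
     loop in present-day Python--the commands are invented for
     illustration, and no 1970s mainframe ran anything of the sort--whose
     whole "interface" is reading a line, maybe answering with lines of
     its own, and letting the transcript pile up:
import datetime

def command_line():
    # Read a line, respond (or not), repeat; the transcript is the interface.
    while True:
        line = input("> ").strip().upper()
        if line == "BYE":
            break
        elif line.startswith("ECHO "):
            print(line[5:])
        elif line == "DATE":
            print(datetime.date.today().isoformat())
        elif line:
            print("?SYNTAX ERROR")

if __name__ == "__main__":
    command_line()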

     When I moved on to college, I did my computing in large, stifling
     rooms where scores of students would sit in front of slightly
     updated versions of the same machines and write computer programs:
     these used dot-matrix printing mechanisms, but were (from the
     computer's point of view) identical to the old teletypes. By that
     point, computers were better at time-sharing--that is, mainframes
     were still mainframes, but they were better at communicating with a
     large number of terminals at once. Consequently, it was no longer
     necessary to use batch processing. Card readers were shoved out into
     hallways and boiler rooms, and batch processing became a nerds-only
     kind of thing, and consequently took on a certain eldritch flavor
     among those of us who even knew it existed. We were all off the
     Batch, and on the Command Line, interface now--my very first shift
     in operating system paradigms, if only I'd known it.

     A huge stack of accordion-fold paper sat on the floor underneath
     each one of these glorified teletypes, and miles of paper shuddered
     through their platens. Almost all of this paper was thrown away or
     recycled without ever having been touched by ink--an ecological
     atrocity so glaring that those machines were soon replaced by video
     terminals--so-called "glass teletypes"--which were quieter and
     didn't waste paper. Again, though, from the computer's point of view
     these were indistinguishable from World War II-era teletype
     machines. In effect we still used Victorian technology to
     communicate with computers until about 1984, when the Macintosh was
     introduced with its Graphical User Interface. Even after that, the
     Command Line continued to exist as an underlying stratum--a sort of
     brainstem reflex--of many modern computer systems all through the
     heyday of Graphical User Interfaces, or GUIs as I will call them
     from now on.

     GUIs

     Now the first job that any coder needs to do when writing a new
     piece of software is to figure out how to take the information that
     is being worked with (in a graphics program, an image; in a
     spreadsheet, a grid of numbers) and turn it into a linear string of
     bytes. These strings of bytes are commonly called files or (somewhat
     more hiply) streams. They are to telegrams what modern humans are to
     Cro-Magnon man, which is to say the same thing under a different
     name. All that you see on your computer screen--your Tomb Raider,
     your digitized voice mail messages, faxes, and word processing
     documents written in thirty-seven different typefaces--is still,
     from the computer's point of view, just like telegrams, except much
     longer, and demanding of more arithmetic.
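
     The flattening step itself is nothing exotic. Here is a minimal
     sketch in Python--my own toy, with the header layout and field sizes
     chosen arbitrarily--that turns a spreadsheet-style grid of numbers
     into one linear string of bytes and then reads it back:
import struct

# A spreadsheet-style grid of numbers...
grid = [[1.5, 2.0, 3.25],
        [4.0, 5.5, 6.75]]
rows, cols = len(grid), len(grid[0])

# ...flattened into one linear string of bytes: a small header (the row
# and column counts), then the cells, row by row.
stream = struct.pack("<II", rows, cols)
for row in grid:
    stream += struct.pack("<" + "d" * cols, *row)
print(len(stream), "bytes")

# Reading the "file" back is just the reverse walk over the same bytes.
r, c = struct.unpack_from("<II", stream, 0)
cells = struct.unpack_from("<" + "d" * (r * c), stream, 8)
print([list(cells[i * c:(i + 1) * c]) for i in range(r)])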

     The quickest way to get a taste of this is to fire up your web
     browser, visit a site, and then select the View/Document Source menu
     item. You will get a bunch of computer code that looks something
     like this:
<HTML>

<HEAD>
<TITLE>Welcome to the Avon Books Homepage</TITLE>
</HEAD>

<MAP NAME="left0199">
     <AREA SHAPE="rect" COORDS="16,56,111,67" HREF="/bard/">
     <AREA SHAPE="rect" COORDS="14,77,111,89" HREF="/eos/">
     <AREA SHAPE="rect" COORDS="17,98,112,110" HREF="/twilight/">
     <AREA SHAPE="rect" COORDS="18,119,112,131"
        HREF="/avon_user/category.html?category_id=271">
     <AREA SHAPE="rect" COORDS="19,140,112,152"
        HREF="http://www.goners.com/">
     <AREA SHAPE="rect" COORDS="18,161,111,173"
        HREF="http://www.spikebooks.com/">
     <AREA SHAPE="rect" COORDS="2,181,112,195"
        HREF="/avon_user/category.html?category_id=277">
     <AREA SHAPE="rect" COORDS="9,203,112,216"
        HREF="/chathamisland/">
     <AREA SHAPE="rect" COORDS="7,223,112,236"
        HREF="/avon_user/search.html">
</MAP>

<BODY TEXT="#478CFF" LINK="#FFFFFF" VLINK="#000000" ALINK="#478CFF" BGCOLOR="#003399">
<TABLE BORDER="0" WIDTH="600" CELLPADDING="0" CELLSPACING="0">

<TR VALIGN=TOP>

     <TD ROWSPAN="3">
     <A HREF="/cgi-bin/imagemap/maps/left.gif.map"><IMG
SRC="/avon/images/home/nav/left0199.gif" WIDTH="113" HEIGHT="280"
BORDER="0" USEMAP="#left0199"></A></TD><TD ROWSPAN="3"><IMG
SRC="/avon/images/home/homepagejan98/2ndleft.gif" WIDTH="144"
HEIGHT="280" BORDER="0"></TD><TD><A HREF="/avon/about.html"><IMG
SRC="/avon/images/home/homepagejan98/aboutavon.gif" ALT="About
Avon Books" WIDTH="199" HEIGHT="44" BORDER="0"></A></TD><TD
ROWSPAN="3"><A HREF="/avon/fiction/guides.html"><IMG
SRC="/avon/images/home/feb98/right1.gif" ALT="Reading Groups"
WIDTH="165" HEIGHT="121" BORDER="0"></A><BR><A
HREF="/avon/feature/feb99/crook.html"><IMG
SRC="/avon/images/home/feb99/crook_text.gif" ALT="The Crook
Factory" WIDTH="165" HEIGHT="96" BORDER="0"></A><BR><A
HREF="http://apps.hearstnewmedia.com/cgi-bin/gx.cgi/AppLogic+APPSSURVEYS
Questionnaire?domain_id=182&survey_id=541"><IMG
SRC="/avon/images/home/feb99/env_text.gif" ALT="The Envelope
Please" WIDTH="165" HEIGHT="63" BORDER="0"></A></TD>
</TR>

<TR VALIGN=TOP><TD><IMG SRC="/avon/images/home/feb98/main.gif"
WIDTH="199" HEIGHT="182" BORDER="0"></TD></TR><TR
VALIGN=TOP><TD><A HREF="/avon/feature/jan99/sitchin.html"><IMG
SRC="/avon/images/home/jan99/sitchin_text.gif" WIDTH="199"
HEIGHT="54" BORDER="0"></A></TD></TR><TR VALIGN=TOP><TD
COLSPAN="4"><IMG
SRC="/avon/images/home/jan99/avon_bottom_beau.gif" WIDTH="622"
HEIGHT="179" BORDER="0" USEMAP="#bottom"></TD></TR><TR><TD
ALIGN=CENTER VALIGN=TOP COLSPAN="4"><FONT SIZE="2"
FACE="ARIAL,COURIER"><PRE>

</PRE><A HREF="/avon/ordering.html">How to order</A> | <A
HREF="/avon/faq.html#manu">How to submit a Manuscript</A> | <A
HREF="mailto:avonweb@hearst.com">Contact us</A> | <A
HREF="/avon/policy.html">Privacy Policy</A></FONT>

<P>
</FONT></TD>

</TR>

</TABLE>

</BODY>

</HTML>

     This crud is called HTML (HyperText Markup Language) and it is
     basically a very simple programming language instructing your web
     browser how to draw a page on a screen. Anyone can learn HTML and
     many people do. The important thing is that no matter what splendid
     multimedia web pages they might represent, HTML files are just
     telegrams.

     When Ronald Reagan was a radio announcer, he used to call baseball
     games by reading the terse descriptions that trickled in over the
     telegraph wire and were printed out on a paper tape. He would sit
     there, all by himself in a padded room with a microphone, and the
     paper tape would eke out of the machine and crawl over the palm of
     his hand printed with cryptic abbreviations. If the count went to
     three and two, Reagan would describe the scene as he saw it in his
     mind's eye: "The brawny left-hander steps out of the batter's box to
     wipe the sweat from his brow. The umpire steps forward to sweep the
     dirt from home plate." and so on. When the cryptogram on the paper
     tape announced a base hit, he would whack the edge of the table with
     a pencil, creating a little sound effect, and describe the arc of
     the ball as if he could actually see it. His listeners, many of whom
     presumably thought that Reagan was actually at the ballpark watching
     the game, would reconstruct the scene in their minds according to
     his descriptions.

     This is exactly how the World Wide Web works: the HTML files are the
     pithy description on the paper tape, and your Web browser is Ronald
     Reagan. The same is true of Graphical User Interfaces in general.

     So an OS is a stack of metaphors and abstractions that stands
     between you and the telegrams, embodying various tricks the
     programmer used to convert the information you're working with--be
     it images, e-mail messages, movies, or word processing
     documents--into the necklaces of bytes that are the only things
     computers know how to work with. When we used actual telegraph
     equipment (teletypes) or their higher-tech substitutes ("glass
     teletypes," or the MS-DOS command line) to work with our computers,
     we were very close to the bottom of that stack. When we use most
     modern operating systems, though, our interaction with the machine
     is heavily mediated. Everything we do is interpreted and translated
     time and again as it works its way down through all of the metaphors
     and abstractions.
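
     A deliberately tiny sketch of that descent, in Python--the layer
     boundaries and the file name are invented for illustration--might
     look like this, with a "document" at the top being translated, step
     by step, into the string of bytes the machine actually handles:
import os

def save_document(paragraphs, path):
    # Metaphor layer: a "document" made of paragraphs.
    text = "\n\n".join(paragraphs)
    # Translation layer: the document becomes a string of bytes.
    data = text.encode("utf-8")
    # Bottom of the stack: hand the bytes to the operating system.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        os.write(fd, data)
    finally:
        os.close(fd)

save_document(["Dear reader,", "Underneath, this is still a telegram."],
              "note.txt")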

     The Macintosh OS was a revolution in both the good and bad senses of
     that word. Obviously it was true that command line interfaces were
     not for everyone, and that it would be a good thing to make
     computers more accessible to a less technical audience--if not for
     altruistic reasons, then because those sorts of people constituted
     an incomparably vaster market. It was clear that the Mac's engineers
     saw a whole new country stretching out before them; you could almost
     hear them muttering, "Wow! We don't have to be bound by files as
     linear streams of bytes anymore, vive la revolution, let's see how
     far we can take this!" No command line interface was available on
     the Macintosh; you talked to it with the mouse, or not at all. This
     was a statement of sorts, a credential of revolutionary purity. It
     seemed that the designers of the Mac intended to sweep Command Line
     Interfaces into the dustbin of history.

     My own personal love affair with the Macintosh began in the spring
     of 1984 in a computer store in Cedar Rapids, Iowa, when a friend of
     mine--coincidentally, the son of the MGB owner--showed me a
     Macintosh running MacPaint, the revolutionary drawing program. It
     ended in July of 1995 when I tried to save a big important file on
     my Macintosh Powerbook and instead of doing so, it
     annihilated the data so thoroughly that two different disk crash
     utility programs were unable to find any trace that it had ever
     existed. During the intervening ten years, I had a passion for the
     MacOS that seemed righteous and reasonable at the time but in
     retrospect strikes me as being exactly the same sort of goofy
     infatuation that my friend's dad had with his car.

     The introduction of the Mac triggered a sort of holy war in the
     computer world. Were GUIs a brilliant design innovation that made
     computers more human-centered and therefore accessible to the
     masses, leading us toward an unprecedented revolution in human
     society, or an insulting bit of audiovisual gimcrackery dreamed up
     by flaky Bay Area hacker types that stripped computers of their
     power and flexibility and turned the noble and serious work of
     computing into a childish video game?

     This debate actually seems more interesting to me today than it did
     in the mid-1980s. But people more or less stopped debating it when
     Microsoft endorsed the idea of GUIs by coming out with the first
     Windows. At this point, command-line partisans were relegated to the
     status of silly old grouches, and a new conflict was touched off,
     between users of MacOS and users of Windows.

     There was plenty to argue about. The first Macintoshes looked
     different from other PCs even when they were turned off: they
     consisted of one box containing both CPU (the part of the computer
     that does arithmetic on bits) and monitor screen. This was billed,
     at the time, as a philosophical statement of sorts: Apple wanted to
     make the personal computer into an appliance, like a toaster. But it
     also reflected the purely technical demands of running a graphical
     user interface. In a GUI machine, the chips that draw things on the
     screen have to be integrated with the computer's central processing
     unit, or CPU, to a far greater extent than is the case with
     command-line interfaces, which until recently didn't even know that
     they weren't just talking to teletypes.

     This distinction was of a technical and abstract nature, but it
     became clearer when the machine crashed (it is commonly the case
     with technologies that you can get the best insight about how they
     work by watching them fail). When everything went to hell and the
     CPU began spewing out random bits, the result, on a CLI machine, was
     lines and lines of perfectly formed but random characters on the
     screen--known to cognoscenti as "going Cyrillic." But to the MacOS,
     the screen was not a teletype, but a place to put graphics; the
     image on the screen was a bitmap, a literal rendering of the
     contents of a particular portion of the computer's memory. When the
     computer crashed and wrote gibberish into the bitmap, the result was
     something that looked vaguely like static on a broken television
     set--a "snow crash."
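
     The difference is easy to fake in a few lines of Python--my own
     illustration, with nothing Apple- or DOS-specific about it: the same
     random bytes come out as rows of well-formed gibberish on a
     character terminal, and as visual static when each bit is treated as
     a pixel:
import random

random.seed(0)
garbage = bytes(random.randrange(256) for _ in range(80))

# CLI machine: every byte is forced into a printable character cell.
print("".join(chr(33 + b % 94) for b in garbage))

# GUI machine: the bytes ARE the screen; each bit becomes a pixel, on or off.
for row in range(8):
    chunk = garbage[row * 10:(row + 1) * 10]
    print("".join("#" if (b >> i) & 1 else " " for b in chunk for i in range(8)))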

     And even after the introduction of Windows, the underlying
     differences endured; when a Windows machine got into trouble, the
     old command-line interface would fall down over the GUI like an
     asbestos fire curtain sealing off the proscenium of a burning opera
     house. When a Macintosh got into trouble it presented you with a
     cartoon of a bomb, which was funny the first time you saw it.

     And these were by no means superficial differences. The reversion of
     Windows to a CLI when it was in distress proved to Mac partisans
     that Windows was nothing more than a cheap facade, like a garish
     afghan flung over a rotted-out sofa. They were disturbed and annoyed
     by the sense that lurking underneath Windows' ostensibly
     user-friendly interface was--literally--a subtext.

     For their part, Windows fans might have made the sour observation
     that all computers, even Macintoshes, were built on that same
     subtext, and that the refusal of Mac owners to admit that fact to
     themselves seemed to signal a willingness, almost an eagerness, to
     be duped.

     Anyway, a Macintosh had to switch individual bits in the memory
     chips on the video card, and it had to do it very fast, and in
     arbitrarily complicated patterns. Nowadays this is cheap and easy,
     but in the technological regime that prevailed in the early 1980s,
     the only realistic way to do it was to build the motherboard (which
     contained the CPU) and the video system (which contained the memory
     that was mapped onto the screen) as a tightly integrated
     whole--hence the single, hermetically sealed case that made the
     Macintosh so distinctive.

     When Windows came out, it was conspicuous for its ugliness, and its
     current successors, Windows 95 and Windows NT, are not things that
     people would pay money to look at either. Microsoft's complete
     disregard for aesthetics gave all of us Mac-lovers plenty of
     opportunities to look down our noses at them. That Windows looked an
     awful lot like a direct ripoff of MacOS gave us a burning sense of
     moral outrage to go with it. Among people who really knew and
     appreciated computers (hackers, in Steven Levy's non-pejorative
     sense of that word) and in a few other niches such as professional
     musicians, graphic artists and schoolteachers, the Macintosh, for a
     while, was simply the computer. It was seen as not only a superb
     piece of engineering, but an embodiment of certain ideals about the
     use of technology to benefit mankind, while Windows was seen as a
     pathetically clumsy imitation and a sinister world domination plot
     rolled into one. So very early, a pattern had been established that
     endures to this day: people dislike Microsoft, which is okay; but
     they dislike it for reasons that are poorly considered, and in the
     end, self-defeating.

     CLASS STRUGGLE ON THE DESKTOP

     Now that the Third Rail has been firmly grasped, it is worth
     reviewing some basic facts here: like any other publicly traded,
     for-profit corporation, Microsoft has, in effect, borrowed a bunch
     of money from some people (its stockholders) in order to be in the
     bit business. As an officer of that corporation, Bill Gates has one
     responsibility only, which is to maximize return on investment. He
     has done this incredibly well. Any actions taken in the world by
     Microsoft--any software released by them, for example--are basically
     epiphenomena, which can't be interpreted or understood except
     insofar as they reflect Bill Gates's execution of his one and only
     responsibility.

     It follows that if Microsoft sells goods that are aesthetically
     unappealing, or that don't work very well, it does not mean that
     they are (respectively) philistines or half-wits. It is because
     Microsoft's excellent management has figured out that they can make
     more money for their stockholders by releasing stuff with obvious,
     known imperfections than they can by making it beautiful or
     bug-free. This is annoying, but (in the end) not half so annoying as
     watching Apple inscrutably and relentlessly destroy itself.

     Hostility towards Microsoft is not difficult to find on the Net, and
     it blends two strains: resentful people who feel Microsoft is too
     powerful, and disdainful people who think it's tacky. This is all
     strongly reminiscent of the heyday of Communism and Socialism, when
     the bourgeoisie were hated from both ends: by the proles, because
     they had all the money, and by the intelligentsia, because of their
     tendency to spend it on lawn ornaments. Microsoft is the very
     embodiment of modern high-tech prosperity--it is, in a word,
     bourgeois--and so it attracts all of the same gripes.

     The opening "splash screen" for Microsoft Word 6.0 summed it up
     pretty neatly: when you started up the program you were treated to a
     picture of an expensive enamel pen lying across a couple of sheets
     of fancy-looking handmade writing paper. It was obviously a bid to
     make the software look classy, and it might have worked for some,
     but it failed for me, because the pen was a ballpoint, and I'm a
     fountain pen man. If Apple had done it, they would've used a Mont
     Blanc fountain pen, or maybe a Chinese calligraphy brush. And I
     doubt that this was an accident. Recently I spent a while
     re-installing Windows NT on one of my home computers, and many times
     had to double-click on the "Control Panel" icon. For reasons that
     are difficult to fathom, this icon consists of a picture of a
     clawhammer and a chisel or screwdriver resting on top of a file
     folder.

     These aesthetic gaffes give one an almost uncontrollable urge to
     make fun of Microsoft, but again, it is all beside the point--if
     Microsoft had done focus group testing of possible alternative
     graphics, they probably would have found that the average mid-level
     office worker associated fountain pens with effete upper management
     toffs and was more comfortable with ballpoints. Likewise, the
     regular guys, the balding dads of the world who probably bear the
     brunt of setting up and maintaining home computers, can probably
     relate better to a picture of a clawhammer--while perhaps harboring
     fantasies of taking a real one to their balky computers.

     This is the only way I can explain certain peculiar facts about the
     current market for operating systems, such as that ninety percent of
     all customers continue to buy station wagons off the Microsoft lot
     while free tanks are there for the taking, right across the street.

     A string of ones and zeroes was not a difficult thing for Bill Gates
     to distribute, once he'd thought of the idea. The hard part was
     selling it--reassuring customers that they were actually getting
     something in return for their money.

     Anyone who has ever bought a piece of software in a store has had
     the curiously deflating experience of taking the bright
     shrink-wrapped box home, tearing it open, finding that it's 95
     percent air, throwing away all the little cards, party favors, and
     bits of trash, and loading the disk into the computer. The end
     result (after you've lost the disk) is nothing except some images on
     a computer screen, and some capabilities that weren't there before.
     Sometimes you don't even have that--you have a string of error
     messages instead. But your money is definitely gone. Now we are
     almost accustomed to this, but twenty years ago it was a very dicey
     business proposition. Bill Gates made it work anyway. He didn't make
     it work by selling the best software or offering the cheapest price.
     Instead he somehow got people to believe that they were receiving
     something in exchange for their money.

     The streets of every city in the world are filled with those
     hulking, rattling station wagons. Anyone who doesn't own one feels a
     little weird, and wonders, in spite of himself, whether it might not
     be time to cease resistance and buy one; anyone who does, feels
     confident that he has acquired some meaningful possession, even on
     those days when the vehicle is up on a lift in an auto repair shop.

     All of this is perfectly congruent with membership in the
     bourgeoisie, which is as much a mental as a material state. And it
     explains why Microsoft is regularly attacked, on the Net, from both
     sides. People who are inclined to feel poor and oppressed construe
     everything Microsoft does as some sinister Orwellian plot. People
     who like to think of themselves as intelligent and informed
     technology users are driven crazy by the clunkiness of Windows.

     Nothing is more annoying to sophisticated people than to see someone
     who is rich enough to know better being tacky--unless it is to realize,
     a moment later, that they probably know they are tacky and they
     simply don't care and they are going to go on being tacky, and rich,
     and happy, forever. Microsoft therefore bears the same relationship
     to the Silicon Valley elite as the Beverly Hillbillies did to their
     fussy banker, Mr. Drysdale--who is irritated not so much by the fact
     that the Clampetts moved to his neighborhood as by the knowledge
     that, when Jethro is seventy years old, he's still going to be
     talking like a hillbilly and wearing bib overalls, and he's still
     going to be a lot richer than Mr. Drysdale.

     Even the hardware that Windows ran on, when compared to the machines
     put out by Apple, looked like white-trash stuff, and still mostly
     does. The reason was that Apple was and is a hardware company, while
     Microsoft was and is a software company. Apple therefore had a
     monopoly on hardware that could run MacOS, whereas
     Windows-compatible hardware came out of a free market. The free
     market seems to have decided that people will not pay for
     cool-looking computers; PC hardware makers who hire designers to
     make their stuff look distinctive get their clocks cleaned by
     Taiwanese clone makers punching out boxes that look as if they
     belong on cinderblocks in front of someone's trailer. But Apple
     could make their hardware as pretty as they wanted to and simply
     pass the higher prices on to their besotted consumers, like me. Only
     last week (I am writing this sentence in early Jan. 1999) the
     technology sections of all the newspapers were filled with adulatory
     press coverage of how Apple had released the iMac in several
     happenin' new colors like Blueberry and Tangerine.

     Apple has always insisted on having a hardware monopoly, except for
     a brief period in the mid-1990s when they allowed clone-makers to
     compete with them, before subsequently putting them out of business.
     Macintosh hardware was, consequently, expensive. You didn't open it
     up and fool around with it because doing so would void the warranty.
     In fact the first Mac was specifically designed to be difficult to
     open--you needed a kit of exotic tools, which you could buy through
     little ads that began to appear in the back pages of magazines a few
     months after the Mac came out on the market. These ads always had a
     certain disreputable air about them, like pitches for lock-picking
     tools in the backs of lurid detective magazines.

     This monopolistic policy can be explained in at least three
     different ways.

     THE CHARITABLE EXPLANATION is that the hardware monopoly policy
     reflected a drive on Apple's part to provide a seamless, unified
     blending of hardware, operating system, and software. There is
     something to this. It is hard enough to make an OS that works well
     on one specific piece of hardware, designed and tested by engineers
     who work down the hallway from you, in the same company. Making an
     OS to work on arbitrary pieces of hardware, cranked out by rabidly
     entrepreneurial clonemakers on the other side of the International
     Date Line, is very difficult, and accounts for many of the troubles
     people have using Windows.

     THE FINANCIAL EXPLANATION is that Apple, unlike Microsoft, is and
     always has been a hardware company. It simply depends on revenue
     from selling hardware, and cannot exist without it.

     THE NOT-SO-CHARITABLE EXPLANATION has to do with Apple's corporate
     culture, which is rooted in Bay Area Baby Boomdom.

     Now, since I'm going to talk for a moment about culture, full
     disclosure is probably in order, to protect myself against
     allegations of conflict of interest and ethical turpitude: (1)
     Geographically I am a Seattleite, of a Saturnine temperament, and
     inclined to take a sour view of the Dionysian Bay Area, just as they
     tend to be annoyed and appalled by us. (2) Chronologically I am a
     post-Baby Boomer. I feel that way, at least, because I never
     experienced the fun and exciting parts of the whole Boomer
     scene--just spent a lot of time dutifully chuckling at Boomers'
     maddeningly pointless anecdotes about just how stoned they got on
     various occasions, and politely fielding their assertions about how
     great their music was. But even from this remove it was possible to
     glean certain patterns, and one that recurred as regularly as an
     urban legend was the one about how someone would move into a commune
     populated by sandal-wearing, peace-sign flashing flower children,
     and eventually discover that, underneath this facade, the guys who
     ran it were actually control freaks; and that, as living in a
     commune, where much lip service was paid to ideals of peace, love
     and harmony, had deprived them of normal, socially approved outlets
     for their control-freakdom, it tended to come out in other,
     invariably more sinister, ways.

     Applying this to the case of Apple Computer will be left as an
     exercise for the reader, and not a very difficult exercise.

     It is a bit unsettling, at first, to think of Apple as a control
     freak, because it is completely at odds with their corporate image.
     Weren't these the guys who aired the famous Super Bowl ads showing
     suited, blindfolded executives marching like lemmings off a cliff?
     Isn't this the company that even now runs ads picturing the Dalai
     Lama (except in Hong Kong) and Einstein and other offbeat rebels?

     It is indeed the same company, and the fact that they have been able
     to plant this image of themselves as creative and rebellious
     free-thinkers in the minds of so many intelligent and media-hardened
     skeptics really gives one pause. It is testimony to the insidious
     power of expensive slick ad campaigns and, perhaps, to a certain
     amount of wishful thinking in the minds of people who fall for them.
     It also raises the question of why Microsoft is so bad at PR, when
     the history of Apple demonstrates that, by writing large checks to
     good ad agencies, you can plant a corporate image in the minds of
     intelligent people that is completely at odds with reality. (The
     answer, for people who don't like Damoclean questions, is that since
     Microsoft has won the hearts and minds of the silent majority--the
     bourgeoisie--they don't give a damn about having a slick image, any
     more than Dick Nixon did. "I want to believe,"--the mantra that Fox
     Mulder has pinned to his office wall in The X-Files--applies in
     different ways to these two companies; Mac partisans want to believe
     in the image of Apple purveyed in those ads, and in the notion that
     Macs are somehow fundamentally different from other computers, while
     Windows people want to believe that they are getting something for
     their money, engaging in a respectable business transaction).

     In any event, as of 1987, both MacOS and Windows were out on the
     market, running on hardware platforms that were radically different
     from each other--not only in the sense that MacOS used Motorola CPU
     chips while Windows used Intel, but in the sense--then overlooked,
     but in the long run, vastly more significant--that the Apple
     hardware business was a rigid monopoly and the Windows side was a
     churning free-for-all.

     But the full ramifications of this did not become clear until very
     recently--in fact, they are still unfolding, in remarkably strange
     ways, as I'll explain when we get to Linux. The upshot is that
     millions of people got accustomed to using GUIs in one form or
     another. By doing so, they made Apple/Microsoft a lot of money. The
     fortunes of many people have become bound up with the ability of
     these companies to continue selling products whose salability is
     very much open to question.

     HONEY-POT, TAR-PIT, WHATEVER

     When Gates and Allen invented the idea of selling software, they ran
     into criticism from both hackers and sober-sided businesspeople.
     Hackers understood that software was just information, and objected
     to the idea of selling it. These objections were partly moral. The
     hackers were coming out of the scientific and academic world where
     it is imperative to make the results of one's work freely available
     to the public. They were also partly practical; how can you sell
     something that can be easily copied? Businesspeople, who are polar
     opposites of hackers in so many ways, had objections of their own.
     Accustomed to selling toasters and insurance policies, they
     naturally had a difficult time understanding how a long collection
     of ones and zeroes could constitute a salable product.

     Obviously Microsoft prevailed over these objections, and so did
     Apple. But the objections still exist. The most hackerish of all the
     hackers, the Ur-hacker as it were, was and is Richard Stallman, who
     became so annoyed with the evil practice of selling software that,
     in 1984 (the same year that the Macintosh went on sale) he went off
     and founded something called the Free Software Foundation, which
     commenced work on something called GNU. Gnu is an acronym for Gnu's
     Not Unix, but this is a joke in more ways than one, because GNU most
     certainly IS Unix. Because of trademark concerns ("Unix" is
     trademarked by AT&T) they simply could not claim that it was Unix,
     and so, just to be extra safe, they claimed that it wasn't.
     Notwithstanding the incomparable talent and drive possessed by Mr.
     Stallman and other GNU adherents, their project to build a free Unix
     to compete against Microsoft and Apple's OSes was a little bit like
     trying to dig a subway system with a teaspoon. Until, that is, the
     advent of Linux, which I will get to later.

     But the basic idea of re-creating an operating system from scratch
     was perfectly sound and completely doable. It has been done many
     times. It is inherent in the very nature of operating systems.

     Operating systems are not strictly necessary. There is no reason why
     a sufficiently dedicated coder could not start from nothing with
     every project and write fresh code to handle such basic, low-level
     operations as controlling the read/write heads on the disk drives
     and lighting up pixels on the screen. The very first computers had
     to be programmed in this way. But since nearly every program needs
     to carry out those same basic operations, this approach would lead
     to vast duplication of effort.

     Nothing is more disagreeable to the hacker than duplication of
     effort. The first and most important mental habit that people
     develop when they learn how to write computer programs is to
     generalize, generalize, generalize. To make their code as modular
     and flexible as possible, breaking large problems down into small
     subroutines that can be used over and over again in different
     contexts. Consequently, the development of operating systems,
     despite being technically unnecessary, was inevitable. Because at
     its heart, an operating system is nothing more than a library
     containing the most commonly used code, written once (and hopefully
     written well) and then made available to every coder who needs it.
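
     A bare-bones sketch of that library idea, again in Python--the
     programs and the routine are invented for illustration--shows three
     very different programs delegating the same low-level chore to one
     piece of commonly used code instead of each re-inventing it:
import os, sys

def put_line(text):
    # The common code, written once: hand the bytes to the operating system
    # rather than driving the display hardware directly.
    os.write(sys.stdout.fileno(), (text + "\n").encode())

def spreadsheet():  put_line("SUM(A1:A3) = 42")
def mail_client():  put_line("You have 3 new messages")
def game():         put_line("GAME OVER")

for program in (spreadsheet, mail_client, game):
    program()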

     So a proprietary, closed, secret operating system is a contradiction
     in terms. It goes against the whole point of having an operating
     system. And it is impossible to keep them secret anyway. The source
     code--the original lines of text written by the programmers--can be
     kept secret. But an OS as a whole is a collection of small
     subroutines that do very specific, very clearly defined jobs.
     Exactly what those subroutines do has to be made public, quite
     explicitly and exactly, or else the OS is completely useless to
     programmers; they can't make use of those subroutines if they don't
     have a complete and perfect understanding of what the subroutines
     do.

     The only thing that isn't made public is exactly how the subroutines
     do what they do. But once you know what a subroutine does, it's
     generally quite easy (if you are a hacker) to write one of your own
     that does exactly the same thing. It might take a while, and it is
     tedious and unrewarding, but in most cases it's not really hard.

     What's hard, in hacking as in fiction, is not writing; it's deciding
     what to write. And the vendors of commercial OSes have already
     decided, and published their decisions.
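
     As a toy example of writing "one of your own that does exactly the
     same thing"--mine, and vastly smaller than any real OS
     subroutine--here is a published contract and a from-scratch clone
     written from nothing but that contract, interchangeable as far as
     any caller is concerned:
def vendor_upper(s):
    """Published contract: return s with every lowercase ASCII letter
    converted to uppercase; everything else passes through unchanged."""
    return "".join(c.upper() if "a" <= c <= "z" else c for c in s)

def clone_upper(s):
    # Written from the contract alone, without seeing the vendor's code.
    return "".join(chr(ord(c) - 32) if "a" <= c <= "z" else c for c in s)

assert vendor_upper("ms-dos 6.22") == clone_upper("ms-dos 6.22")
print("the clone behaves identically")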

     This has been generally understood for a long time. MS-DOS was
     duplicated, functionally, by a rival product, written from scratch,
     called DR-DOS, that did all of the same things in pretty much the
     same way. In other words, another company was able to write code
     that did all of the same things as MS-DOS and sell it at a profit.
     If you are using the Linux OS, you can get a free program called
     WINE, which is a Windows emulator; that is, you can open up a window
     on your desktop that runs Windows programs. It means that a
     completely functional Windows OS has been recreated inside of Unix,
     like a ship in a bottle. And Unix itself, which is vastly more
     sophisticated than MS-DOS, has been built up from scratch many times
     over. Versions of it are sold by Sun, Hewlett-Packard, AT&T, Silicon
     Graphics, IBM, and others.

     People have, in other words, been re-writing basic OS code for so
     long that all of the technology that constituted an "operating
     system" in the traditional (pre-GUI) sense of that phrase is now so
     cheap and common that it's literally free. Not only could Gates and
     Allen not sell MS-DOS today, they could not even give it away,
     because much more powerful OSes are already being given away. Even
     the original Windows (which was the only Windows until 1995) has
     become worthless, in that there is no point in owning something that
     can be emulated inside of Linux--which is, itself, free.

     In this way the OS business is very different from, say, the car
     business. Even an old rundown car has some value. You can use it for
     making runs to the dump, or strip it for parts. It is the fate of
     manufactured goods to slowly and gently depreciate as they get old
     and have to compete against more modern products.

     But it is the fate of operating systems to become free.

     Microsoft is a great software applications company.
     Applications--such as Microsoft Word--are an area where innovation
     brings real, direct, tangible benefits to users. The innovations
     might be new technology straight from the research department, or
     they might be in the category of bells and whistles, but in any
     event they are frequently useful and they seem to make users happy.
     And Microsoft is in the process of becoming a great research
     company. But Microsoft is not such a great operating systems
     company. And this is not necessarily because their operating systems
     are all that bad from a purely technological standpoint. Microsoft's
     OSes do have their problems, sure, but they are vastly better than
     they used to be, and they are adequate for most people.

     Why, then, do I say that Microsoft is not such a great operating
     systems company? Because the very nature of operating systems is
     such that it is senseless for them to be developed and owned by a
     specific company. It's a thankless job to begin with. Applications
     create possibilities for millions of credulous users, whereas OSes
     impose limitations on thousands of grumpy coders, and so OS-makers
     will forever be on the shit-list of anyone who counts for anything
     in the high-tech world. Applications get used by people whose big
     problem is understanding all of their features, whereas OSes get
     hacked by coders who are annoyed by their limitations. The OS
     business has been good to Microsoft only insofar as it has given
     them the money they needed to launch a really good applications
     software business and to hire a lot of smart researchers. Now it
     really ought to be jettisoned, like a spent booster stage from a
     rocket. The big question is whether Microsoft is capable of doing
     this. Or is it addicted to OS sales in the same way as Apple is to
     selling hardware?

     Keep in mind that Apple's ability to monopolize its own hardware
     supply was once cited, by learned observers, as a great advantage
     over Microsoft. At the time, it seemed to place them in a much
     stronger position. In the end, it nearly killed them, and may kill
     them yet. The problem, for Apple, was that most of the world's
     computer users ended up owning cheaper hardware. But cheap hardware
     couldn't run MacOS, and so these people switched to Windows.

     Replace "hardware" with "operating systems," and "Apple" with
     "Microsoft" and you can see the same thing about to happen all over
     again. Microsoft dominates the OS market, which makes them money and
     seems like a great idea for now. But cheaper and better OSes are
     available, and increasingly popular, in parts of the world that
     are not so saturated with computers as the US. Ten years from now,
     most of the world's computer users may end up owning these cheaper
     OSes. But these OSes do not, for the time being, run any Microsoft
     applications, and so these people will use something else.

     To put it more directly: every time someone decides to use a
     non-Microsoft OS, Microsoft's OS division, obviously, loses a
     customer. But, as things stand now, Microsoft's applications
     division loses a customer too. This is not such a big deal as long
     as almost everyone uses Microsoft OSes. But as soon as Windows'
     market share begins to slip, the math starts to look pretty dismal
     for the people in Redmond.

     This argument could be countered by saying that Microsoft could
     simply re-compile its applications to run under other OSes. But this
     strategy goes against most normal corporate instincts. Again the
     case of Apple is instructive. When things started to go south for
     Apple, they should have ported their OS to cheap PC hardware. But
     they didn't. Instead, they tried to make the most of their brilliant
     hardware, adding new features and expanding the product line. But
     this only had the effect of making their OS more dependent on these
     special hardware features, which made it worse for them in the end.

     Likewise, when Microsoft's position in the OS world is threatened,
     their corporate instincts will tell them to pile more new features
     into their operating systems, and then re-jigger their software
     applications to exploit those special features. But this will only
     have the effect of making their applications dependent on an OS with
     declining market share, and make it worse for them in the end.

     The operating system market is a death-trap, a tar-pit, a slough of
     despond. There are only two reasons to invest in Apple and
     Microsoft: (1) each of these companies is in what we would call a
     co-dependency relationship with their customers. The customers Want
     To Believe, and Apple and Microsoft know how to give them what they
     want. (2) each company works very hard to add new features to their
     OSes, which works to secure customer loyalty, at least for a little
     while.

     Accordingly, most of the remainder of this essay will be about those
     two topics.

     THE TECHNOSPHERE

     Unix is the only OS remaining whose GUI (a vast suite of code called
     the X Window System) is separate from the OS in the old sense of
     the phrase. This is to say that you can run Unix in pure
     command-line mode if you want to, with no windows, icons, mouses,
     etc. whatsoever, and it will still be Unix and capable of doing
     everything Unix is supposed to do. But the other OSes: MacOS, the
     Windows family, and BeOS, have their GUIs tangled up with the
     old-fashioned OS functions to the extent that they have to run in
     GUI mode, or else they are not really running. So it's no longer
     really possible to think of GUIs as being distinct from the OS;
     they're now an inextricable part of the OSes that they belong
     to--and they are by far the largest part, and by far the most
     expensive and difficult part to create.
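
To make the separation concrete, here is a hypothetical little Python
sketch--purely illustrative, not part of any real system--that checks
whether an X display happens to be available and carries on either
way; the Unix underneath is the same in both cases.

import os

# Hypothetical illustration: the DISPLAY variable is how X clients find
# the X server. If it is unset, no GUI is running, but it is still Unix.
if os.environ.get("DISPLAY"):
    print("An X server is available; a GUI program could open a window.")
else:
    print("No X display--still a perfectly good Unix, on the command line.")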

     There are only two ways to sell a product: price and features. When
     OSes are free, OS companies cannot compete on price, and so they
     compete on features. This means that they are always trying to outdo
     each other writing code that, until recently, was not considered to
     be part of an OS at all: stuff like GUIs. This explains a lot about
     how these companies behave.

     It explains why Microsoft added a browser to their OS, for example.
     It is easy to get free browsers, just as it is to get free OSes. If
     browsers are free, and OSes are free, it would seem that there is no
     way to make money from browsers or OSes. But if you can integrate a
     browser into the OS and thereby imbue both of them with new
     features, you have a salable product.

     Setting aside, for the moment, the fact that this makes government
     anti-trust lawyers really mad, this strategy makes sense. At least,
     it makes sense if you assume (as Microsoft's management appears to)
     that the OS has to be protected at all costs. The real question is
     whether every new technological trend that comes down the pike ought
     to be used as a crutch to maintain the OS's dominant position.
     Confronted with the Web phenomenon, Microsoft had to develop a
     really good web browser, and they did. But then they had a choice:
     they could have made that browser work on many different OSes, which
     would give Microsoft a strong position in the Internet world no
     matter what happened to their OS market share. Or they could make
     the browser one with the OS, gambling that this would make the OS
     look so modern and sexy that it would help to preserve their
     dominance in that market. The problem is that when Microsoft's OS
     position begins to erode (and since it is currently at something
     like ninety percent, it can't go anywhere but down) it will drag
     everything else down with it.

     In your high school geology class you probably were taught that all
     life on earth exists in a paper-thin shell called the biosphere,
     which is trapped between thousands of miles of dead rock underfoot,
     and cold dead radioactive empty space above. Companies that sell
     OSes exist in a sort of technosphere. Underneath is technology that
     has already become free. Above is technology that has yet to be
     developed, or that is too crazy and speculative to be productized
     just yet. Like the Earth's biosphere, the technosphere is very thin
     compared to what is above and what is below.

     But it moves a lot faster. In various parts of our world, it is
     possible to go and visit rich fossil beds where skeleton lies piled
     upon skeleton, recent ones on top and more ancient ones below. In
     theory they go all the way back to the first single-celled
     organisms. And if you use your imagination a bit, you can understand
     that, if you hang around long enough, you'll become fossilized there
     too, and in time some more advanced organism will become fossilized
     on top of you.

     The fossil record--the La Brea Tar Pit--of software technology is
     the Internet. Anything that shows up there is free for the taking
     (possibly illegal, but free). Executives at companies like Microsoft
     must get used to the experience--unthinkable in other industries--of
     throwing millions of dollars into the development of new
     technologies, such as Web browsers, and then seeing the same or
     equivalent software show up on the Internet two years, or a year, or
     even just a few months, later.

     By continuing to develop new technologies and add features onto
     their products they can keep one step ahead of the fossilization
     process, but on certain days they must feel like mammoths caught at
     La Brea, using all their energies to pull their feet, over and over
     again, out of the sucking hot tar that wants to cover and envelop
     them.

     Survival in this biosphere demands sharp tusks and heavy, stomping
     feet at one end of the organization, and Microsoft famously has
     those. But trampling the other mammoths into the tar can only keep
     you alive for so long. The danger is that in their obsession with
     staying out of the fossil beds, these companies will forget about
     what lies above the biosphere: the realm of new technology. In other
     words, they must hang onto their primitive weapons and crude
     competitive instincts, but also evolve powerful brains. This appears
     to be what Microsoft is doing with its research division, which has
     been hiring smart people right and left (Here I should mention that
     although I know, and socialize with, several people in that
     company's research division, we never talk about business issues and
     I have little to no idea what the hell they are up to. I have
     learned much more about Microsoft by using the Linux operating
     system than I ever would have done by using Windows).

     Never mind how Microsoft used to make money; today, it is making its
     money on a kind of temporal arbitrage. "Arbitrage," in the usual
     sense, means to make money by taking advantage of differences in the
     price of something between different markets. It is spatial, in
     other words, and hinges on the arbitrageur knowing what is going on
     simultaneously in different places. Microsoft is making money by
     taking advantage of differences in the price of technology in
     different times. Temporal arbitrage, if I may coin a phrase, hinges
     on the arbitrageur knowing what technologies people will pay money
     for next year, and how soon afterwards those same technologies will
     become free. What spatial and temporal arbitrage have in common is
     that both hinge on the arbitrageur's being extremely well-informed;
     one about price gradients across space at a given time, and the
     other about price gradients over time in a given place.
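
     As a toy illustration--every figure below is made up--the temporal
     arbitrageur's bet reduces to simple arithmetic: what buyers will
     pay now, times how long the technology stays scarce, minus what it
     cost to build.

# Hypothetical figures for a temporal-arbitrage bet; every number below
# is an assumption invented for illustration.
price_per_copy = 90            # what buyers will pay this year
copies_per_month = 2_000_000   # sales rate while the technology is scarce
months_until_free = 18         # how long before an equivalent is given away
cost_to_develop = 50_000_000   # up-front engineering cost

revenue = price_per_copy * copies_per_month * months_until_free
profit = revenue - cost_to_develop
print(f"profit before the technology fossilizes: ${profit:,}")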

     So Apple/Microsoft shower new features upon their users almost
     daily, in the hopes that a steady stream of genuine technical
     innovations, combined with the "I want to believe" phenomenon, will
     prevent their customers from looking across the road towards the
     cheaper and better OSes that are available to them. The question is
     whether this makes sense in the long run. If Microsoft is addicted
     to OSes as Apple is to hardware, then they will bet the whole farm
     on their OSes, and tie all of their new applications and
     technologies to them. Their continued survival will then depend on
     these two things: adding more features to their OSes so that
     customers will not switch to the cheaper alternatives, and
     maintaining the image that, in some mysterious way, gives those
     customers the feeling that they are getting something for their
     money.

     The latter is a truly strange and interesting cultural phenomenon.

     THE INTERFACE CULTURE

     A few years ago I walked into a grocery store somewhere and was
     presented with the following tableau vivant: near the entrance a
     young couple were standing in front of a large cosmetics display.
     The man was stolidly holding a shopping basket between his hands
     while his mate raked blister-packs of makeup off the display and
     piled them in. Since then I've always thought of that man as the
     personification of an interesting human tendency: not only are we
     not offended to be dazzled by manufactured images, but we like it.
     We practically insist on it. We are eager to be complicit in our own
     dazzlement: to pay money for a theme park ride, vote for a guy who's
     obviously lying to us, or stand there holding the basket as it's
     filled up with cosmetics.

     I was in Disney World recently, specifically the part of it called
     the Magic Kingdom, walking up Main Street USA. This is a perfect
     gingerbready Victorian small town that culminates in a Disney
     castle. It was very crowded; we shuffled rather than walked.
     Directly in front of me was a man with a camcorder. It was one of
     the new breed of camcorders where instead of peering through a
     viewfinder you gaze at a flat-panel color screen about the size of a
     playing card, which televises live coverage of whatever the
     camcorder is seeing. He was holding the appliance close to his face,
     so that it obstructed his view. Rather than go see a real small town
     for free, he had paid money to see a pretend one, and rather than
     see it with the naked eye he was watching it on television.

     And rather than stay home and read a book, I was watching him.

     Americans' preference for mediated experiences is obvious enough,
     and I'm not going to keep pounding it into the ground. I'm not even
     going to make snotty comments about it--after all, I was at Disney
     World as a paying customer. But it clearly relates to the colossal
     success of GUIs and so I have to talk about it some. Disney does
     mediated experiences better than anyone. If they understood what
     OSes are, and why people use them, they could crush Microsoft in a
     year or two.

     In the part of Disney World called the Animal Kingdom there is a new
     attraction, slated to open in March 1999, called the Maharajah
     Jungle Trek. It was open for sneak previews when I was there. This
     is a complete stone-by-stone reproduction of a hypothetical ruin in
     the jungles of India. According to its backstory, it was built by a
     local rajah in the 16th Century as a game reserve. He would go there
     with his princely guests to hunt Bengal tigers. As time went on it
     fell into disrepair and the tigers and monkeys took it over;
     eventually, around the time of India's independence, it became a
     government wildlife reserve, now open to visitors.

     The place looks more like what I have just described than any actual
     building you might find in India. All the stones in the broken walls
     are weathered as if monsoon rains had been trickling down them for
     centuries, the paint on the gorgeous murals is flaked and faded just
     so, and Bengal tigers loll amid stumps of broken columns. Where
     modern repairs have been made to the ancient structure, they've been
     done, not as Disney's engineers would do them, but as thrifty Indian
     janitors would--with hunks of bamboo and rust-spotted hunks of
     rebar. The rust is painted on, of course, and protected from real
     rust by a plastic clear-coat, but you can't tell unless you get down
     on your knees.

     In one place you walk along a stone wall with a series of old pitted
     friezes carved into it. One end of the wall has broken off and
     settled into the earth, perhaps because of some long-forgotten
     earthquake, and so a broad jagged crack runs across a panel or two,
     but the story is still readable: first, primordial chaos leads to a
     flourishing of many animal species. Next, we see the Tree of Life
     surrounded by diverse animals. This is an obvious allusion (or, in
     showbiz lingo, a tie-in) to the gigantic Tree of Life that dominates
     the center of Disney's Animal Kingdom just as the Castle dominates
     the Magic Kingdom or the Sphere does Epcot. But it's rendered in
     historically correct style and could probably fool anyone who didn't
     have a Ph.D. in Indian art history.

     The next panel shows a mustachioed H. sapiens chopping down the Tree
     of Life with a scimitar, and the animals fleeing every which way.
     The one after that shows the misguided human getting walloped by a
     tidal wave, part of a latter-day Deluge presumably brought on by his
     stupidity.

     The final panel, then, portrays the Sapling of Life beginning to
     grow back, but now Man has ditched the edged weapon and joined the
     other animals in standing around to adore and praise it.

     It is, in other words, a prophecy of the Bottleneck: the scenario,
     commonly espoused among modern-day environmentalists, that the world
     faces an upcoming period of grave ecological tribulations that will
     last for a few decades or centuries and end when we find a new
     harmonious modus vivendi with Nature.

     Taken as a whole the frieze is a pretty brilliant piece of work.
     Obviously it's not an ancient Indian ruin, and some person or people
     now living deserve credit for it. But there are no signatures on the
     Maharajah's game reserve at Disney World. There are no signatures on
     anything, because it would ruin the whole effect to have long
     strings of production credits dangling from every custom-worn brick,
     as they do from Hollywood movies.

     Among Hollywood writers, Disney has the reputation of being a real
     wicked stepmother. It's not hard to see why. Disney is in the
     business of putting out a product of seamless illusion--a magic
     mirror that reflects the world back better than it really is. But a
     writer is literally talking to his or her readers, not just creating
     an ambience or presenting them with something to look at; and just
     as the command-line interface opens a much more direct and explicit
     channel from user to machine than the GUI, so it is with words,
     writer, and reader.

     The word, in the end, is the only system of encoding thoughts--the
     only medium--that is not fungible, that refuses to dissolve in the
     devouring torrent of electronic media (the richer tourists at Disney
     World wear t-shirts printed with the names of famous designers,
     because designs themselves can be bootlegged easily and with
     impunity. The only way to make clothing that cannot be legally
     bootlegged is to print copyrighted and trademarked words on it; once
     you have taken that step, the clothing itself doesn't really matter,
     and so a t-shirt is as good as anything else. T-shirts with
     expensive words on them are now the insignia of the upper class.
     T-shirts with cheap words, or no words at all, are for the
     commoners).

     But this special quality of words and of written communication would
     have the same effect on Disney's product as spray-painted graffiti
     on a magic mirror. So Disney does most of its communication without
     resorting to words, and for the most part, the words aren't missed.
     Some of Disney's older properties, such as Peter Pan, Winnie the
     Pooh, and Alice in Wonderland, came out of books. But the authors'
     names are rarely if ever mentioned, and you can't buy the original
     books at the Disney store. If you could, they would all seem old and
     queer, like very bad knockoffs of the purer, more authentic Disney
     versions. Compared to more recent productions like Beauty and the
     Beast and Mulan, the Disney movies based on these books
     (particularly Alice in Wonderland and Peter Pan) seem deeply
     bizarre, and not wholly appropriate for children. That stands to
     reason, because Lewis Carroll and J.M. Barrie were very strange men,
     and such is the nature of the written word that their personal
     strangeness shines straight through all the layers of Disneyfication
     like x-rays through a wall. Probably for this very reason, Disney
     seems to have stopped buying books altogether, and now finds its
     themes and characters in folk tales, which have the lapidary,
     time-worn quality of the ancient bricks in the Maharajah's ruins.

     If I can risk a broad generalization, most of the people who go to
     Disney World have zero interest in absorbing new ideas from books.
     Which sounds snide, but listen: they have no qualms about being
     presented with ideas in other forms. Disney World is stuffed with
     environmental messages now, and the guides at Animal Kingdom can
     talk your ear off about biology.

     If you followed those tourists home, you might find art, but it
     would be the sort of unsigned folk art that's for sale in Disney
     World's African- and Asian-themed stores. In general they only seem
     comfortable with media that have been ratified by great age, massive
     popular acceptance, or both.

     In this world, artists are like the anonymous, illiterate stone
     carvers who built the great cathedrals of Europe and then faded away
     into unmarked graves in the churchyard. The cathedral as a whole is
     awesome and stirring in spite, and possibly because, of the fact
     that we have no idea who built it. When we walk through it we are
     communing not with individual stone carvers but with an entire
     culture.

     Disney World works the same way. If you are an intellectual type, a
     reader or writer of books, the nicest thing you can say about this
     is that the execution is superb. But it's easy to find the whole
     environment a little creepy, because something is missing: the
     translation of all its content into clear explicit written words,
     the attribution of the ideas to specific people. You can't argue
     with it. It seems as if a hell of a lot might be being glossed over,
     as if Disney World might be putting one over on us, and possibly
     getting away with all kinds of buried assumptions and muddled
     thinking.

     But this is precisely the same as what is lost in the transition
     from the command-line interface to the GUI.

     Disney and Apple/Microsoft are in the same business:
     short-circuiting laborious, explicit verbal communication with
     expensively designed interfaces. Disney is a sort of user interface
     unto itself--and more than just graphical. Let's call it a Sensorial
     Interface. It can be applied to anything in the world, real or
     imagined, albeit at staggering expense.

     Why are we rejecting explicit word-based interfaces, and embracing
     graphical or sensorial ones--a trend that accounts for the success
     of both Microsoft and Disney?

     Part of it is simply that the world is very complicated now--much
     more complicated than the hunter-gatherer world that our brains
     evolved to cope with--and we simply can't handle all of the details.
     We have to delegate. We have no choice but to trust some nameless
     artist at Disney or programmer at Apple or Microsoft to make a few
     choices for us, close off some options, and give us a conveniently
     packaged executive summary.

     But more importantly, it comes out of the fact that, during this
     century, intellectualism failed, and everyone knows it. In places
     like Russia and Germany, the common people agreed to loosen their
     grip on traditional folkways, mores, and religion, and let the
     intellectuals run with the ball, and they screwed everything up and
     turned the century into an abattoir. Those wordy intellectuals used
     to be merely tedious; now they seem kind of dangerous as well.

     We Americans are the only ones who didn't get creamed at some point
     during all of this. We are free and prosperous because we have
     inherited political and values systems fabricated by a particular
     set of eighteenth-century intellectuals who happened to get it
     right. But we have lost touch with those intellectuals, and with
     anything like intellectualism, even to the point of not reading
     books any more, though we are literate. We seem much more
     comfortable with propagating those values to future generations
     nonverbally, through a process of being steeped in media. Apparently
     this actually works to some degree, for police in many lands are now
     complaining that local arrestees are insisting on having their
     Miranda rights read to them, just like perps in American TV cop
     shows. When it's explained to them that they are in a different
     country, where those rights do not exist, they become outraged.
     Starsky and Hutch reruns, dubbed into diverse languages, may turn
     out, in the long run, to be a greater force for human rights than
     the Declaration of Independence.

     A huge, rich, nuclear-tipped culture that propagates its core values
     through media steepage seems like a bad idea. There is an obvious
     risk of running astray here. Words are the only immutable medium we
     have, which is why they are the vehicle of choice for extremely
     important concepts like the Ten Commandments, the Koran, and the
     Bill of Rights. Unless the messages conveyed by our media are
     somehow pegged to a fixed, written set of precepts, they can wander
     all over the place and possibly dump loads of crap into people's
     minds.

     Orlando used to have a military installation called McCoy Air Force
     Base, with long runways from which B-52s could take off and reach
     Cuba, or just about anywhere else, with loads of nukes. But now
     McCoy has been scrapped and repurposed. It has been absorbed into
     Orlando's civilian airport. The long runways are being used to land
     747-loads of tourists from Brazil, Italy, Russia and Japan, so that
     they can come to Disney World and steep in our media for a while.

     To traditional cultures, especially word-based ones such as Islam,
     this is infinitely more threatening than the B-52s ever were. It is
     obvious, to everyone outside of the United States, that our
     arch-buzzwords, multiculturalism and diversity, are false fronts
     that are being used (in many cases unwittingly) to conceal a global
     trend to eradicate cultural differences. The basic tenet of
     multiculturalism (or "honoring diversity" or whatever you want to
     call it) is that people need to stop judging each other--to stop
     asserting (and, eventually, to stop believing) that this is right
     and that is wrong, this true and that false, one thing ugly and
     another thing beautiful, that God exists and has this or that set of
     qualities.

     The lesson most people are taking home from the Twentieth Century is
     that, in order for a large number of different cultures to coexist
     peacefully on the globe (or even in a neighborhood) it is necessary
     for people to suspend judgment in this way. Hence (I would argue)
     our suspicion of, and hostility towards, all authority figures in
     modern culture. As David Foster Wallace has explained in his essay
     "E Unibus Pluram," this is the fundamental message of television; it
     is the message that people take home, anyway, after they have
     steeped in our media long enough. It's not expressed in these
     highfalutin terms, of course. It comes through as the presumption
     that all authority figures--teachers, generals, cops, ministers,
     politicians--are hypocritical buffoons, and that hip jaded coolness
     is the only way to be.

     The problem is that once you have done away with the ability to make
     judgments as to right and wrong, true and false, etc., there's no
     real culture left. All that remains is clog dancing and macrame. The
     ability to make judgments, to believe things, is the entire point
     of having a culture. I think this is why guys with machine guns
     sometimes pop up in places like Luxor, and begin pumping bullets
     into Westerners. They perfectly understand the lesson of McCoy Air
     Force Base. When their sons come home wearing Chicago Bulls caps
     with the bills turned sideways, the dads go out of their minds.

     The global anti-culture that has been conveyed into every cranny of
     the world by television is a culture unto itself, and by the
     standards of great and ancient cultures like Islam and France, it
     seems grossly inferior, at least at first. The only good thing you
     can say about it is that it makes world wars and Holocausts less
     likely--and that is actually a pretty good thing!

     The only real problem is that anyone who has no culture, other than
     this global monoculture, is completely screwed. Anyone who grows up
     watching TV, never sees any religion or philosophy, is raised in an
     atmosphere of moral relativism, learns about civics from watching
     bimbo eruptions on network TV news, and attends a university where
     postmodernists vie to outdo each other in demolishing traditional
     notions of truth and quality, is going to come out into the world as
     one pretty feckless human being. And--again--perhaps the goal of all
     this is to make us feckless so we won't nuke each other.

     On the other hand, if you are raised within some specific culture,
     you end up with a basic set of tools that you can use to think about
     and understand the world. You might use those tools to reject the
     culture you were raised in, but at least you've got some tools.

     In this country, the people who run things--who populate major law
     firms and corporate boards--understand all of this at some level.
     They pay lip service to multiculturalism and diversity and
     non-judgmentalness, but they don't raise their own children that
     way. I have highly educated, technically sophisticated friends who
     have moved to small towns in Iowa to live and raise their children,
     and there are Hasidic Jewish enclaves in New York where large
     numbers of kids are being brought up according to traditional
     beliefs. Any suburban community might be thought of as a place where
     people who hold certain (mostly implicit) beliefs go to live among
     others who think the same way.

     And not only do these people feel some responsibility to their own
     children, but to the country as a whole. Some of the upper class are
     vile and cynical, of course, but many spend at least part of their
     time fretting about what direction the country is going in, and what
     responsibilities they have. And so issues that are important to
     book-reading intellectuals, such as global environmental collapse,
     eventually percolate through the porous buffer of mass culture and
     show up as ancient Hindu ruins in Orlando.

     You may be asking: what the hell does all this have to do with
     operating systems? As I've explained, there is no way to explain the
     domination of the OS market by Apple/Microsoft without looking to
     cultural explanations, and so I can't get anywhere, in this essay,
     without first letting you know where I'm coming from vis-a-vis
     contemporary culture.

     Contemporary culture is a two-tiered system, like the Morlocks and
     the Eloi in H.G. Wells's The Time Machine, except that it's been
     turned upside down. In The Time Machine the Eloi were an effete
     upper class, supported by lots of subterranean Morlocks who kept the
     technological wheels turning. But in our world it's the other way
     round. The Morlocks are in the minority, and they are running the
     show, because they understand how everything works. The much more
     numerous Eloi learn everything they know from being steeped from
     birth in electronic media directed and controlled by book-reading
     Morlocks. So many ignorant people could be dangerous if they got
     pointed in the wrong direction, and so we've evolved a popular
     culture that is (a) almost unbelievably infectious and (b) neuters
     every person who gets infected by it, by rendering them unwilling to
     make judgments and incapable of taking stands.

     Morlocks, who have the energy and intelligence to comprehend
     details, go out and master complex subjects and produce Disney-like
     Sensorial Interfaces so that Eloi can get the gist without having to
     strain their minds or endure boredom. Those Morlocks will go to
     India and tediously explore a hundred ruins, then come home and
     build sanitary bug-free versions: highlight films, as it were. This
     costs a lot, because Morlocks insist on good coffee and first-class
     airline tickets, but that's no problem because Eloi like to be
     dazzled and will gladly pay for it all.

     Now I realize that most of this probably sounds snide and bitter to
     the point of absurdity: your basic snotty intellectual throwing a
     tantrum about those unlettered philistines. As if I were a
     self-styled Moses, coming down from the mountain all alone, carrying
     the stone tablets bearing the Ten Commandments carved in immutable
     stone--the original command-line interface--and blowing his stack at
     the weak, unenlightened Hebrews worshipping images. Not only that,
     but it sounds like I'm pumping some sort of conspiracy theory.

     But that is not where I'm going with this. The situation I describe,
     here, could be bad, but doesn't have to be bad and isn't necessarily
     bad now:
     * It simply is the case that we are way too busy, nowadays, to
       comprehend everything in detail. And it's better to comprehend it
       dimly, through an interface, than not at all. Better for ten
       million Eloi to go on the Kilimanjaro Safari at Disney World than
       for a thousand cardiovascular surgeons and mutual fund managers to
       go on "real" ones in Kenya.
     * The boundary between these two classes is more porous than I've
       made it sound. I'm always running into regular dudes--construction
       workers, auto mechanics, taxi drivers, galoots in general--who were
       largely aliterate until something made it necessary for them to
       become readers and start actually thinking about things. Perhaps
       they had to come to grips with alcoholism, perhaps they got sent to
       jail, or came down with a disease, or suffered a crisis in
       religious faith, or simply got bored. Such people can get up to
       speed on particular subjects quite rapidly. Sometimes their lack of
       a broad education makes them over-apt to go off on intellectual
       wild goose chases, but, hey, at least a wild goose chase gives you
       some exercise.
     * The spectre of a polity controlled by the fads and whims of voters
       who actually believe that there are significant differences between
       Bud Lite and Miller Lite, and who think that professional wrestling
       is for real, is naturally alarming to people who don't. But then
       countries controlled via the command-line interface, as it were, by
       double-domed intellectuals, be they religious or secular, are
       generally miserable places to live.
     * Sophisticated people deride Disneyesque entertainments as pat and
       saccharine, but, hey, if the result of that is to instill basically
       warm and sympathetic reflexes, at a preverbal level, into hundreds
       of millions of unlettered media-steepers, then how bad can it be?
       We killed a lobster in our kitchen last night and my daughter cried
       for an hour. The Japanese, who used to be just about the fiercest
       people on earth, have become infatuated with cuddly adorable
       cartoon characters.
     * My own family--the people I know best--is divided about evenly
       between people who will probably read this essay and people who
       almost certainly won't, and I can't say for sure that one group is
       necessarily warmer, happier, or better-adjusted than the other.

     MORLOCKS AND ELOI AT THE KEYBOARD

     Back in the days of the command-line interface, users were all
     Morlocks who had to convert their thoughts into alphanumeric symbols
     and type them in, a grindingly tedious process that stripped away
     all ambiguity, laid bare all hidden assumptions, and cruelly
     punished laziness and imprecision. Then the interface-makers went to
     work on their GUIs, and introduced a new semiotic layer between
     people and machines. People who use such systems have abdicated the
     responsibility, and surrendered the power, of sending bits directly
     to the chip that's doing the arithmetic, and handed that
     responsibility and power over to the OS. This is tempting because
     giving clear instructions, to anyone or anything, is difficult. We
     cannot do it without thinking, and depending on the complexity of
     the situation, we may have to think hard about abstract things, and
     consider any number of ramifications, in order to do a good job of
     it. For most of us, this is hard work. We want things to be easier.
     How badly we want it can be measured by the size of Bill Gates's
     fortune.

     The OS has (therefore) become a sort of intellectual labor-saving
     device that tries to translate humans' vaguely expressed intentions
     into bits. In effect we are asking our computers to shoulder
     responsibilities that have always been considered the province of
     human beings--we want them to understand our desires, to anticipate
     our needs, to foresee consequences, to make connections, to handle
     routine chores without being asked, to remind us of what we ought to
     be reminded of while filtering out noise.

     At the upper (which is to say, closer to the user) levels, this is
     done through a set of conventions--menus, buttons, and so on. These
     work in the sense that analogies work: they help Eloi understand
     abstract or unfamiliar concepts by likening them to something known.
     But the loftier word "metaphor" is used.

     The overarching concept of the MacOS was the "desktop metaphor" and
     it subsumed any number of lesser (and frequently conflicting, or at
     least mixed) metaphors. Under a GUI, a file (frequently called
     "document") is metaphrased as a window on the screen (which is
     called a "desktop"). The window is almost always too small to
     contain the document and so you "move around," or, more
     pretentiously, "navigate" in the document by "clicking and dragging"
     the "thumb" on the "scroll bar." When you "type" (using a keyboard)
     or "draw" (using a "mouse") into the "window" or use pull-down
     "menus" and "dialog boxes" to manipulate its contents, the results
     of your labors get stored (at least in theory) in a "file," and
     later you can pull the same information back up into another
     "window." When you don't want it anymore, you "drag" it into the
     "trash."

     There is massively promiscuous metaphor-mixing going on here, and I
     could deconstruct it 'til the cows come home, but I won't. Consider
     only one word: "document." When we document something in the real
     world, we make fixed, permanent, immutable records of it. But
     computer documents are volatile, ephemeral constellations of data.
     Sometimes (as when you've just opened or saved them) the document as
     portrayed in the window is identical to what is stored, under the
     same name, in a file on the disk, but other times (as when you have
     made changes without saving them) it is completely different. In any
     case, every time you hit "Save" you annihilate the previous version
     of the "document" and replace it with whatever happens to be in the
     window at the moment. So even the word "save" is being used in a
     sense that is grotesquely misleading--"destroy one version, save
     another" would be more accurate.

     Anyone who uses a word processor for very long inevitably has the
     experience of putting hours of work into a long document and then
     losing it because the computer crashes or the power goes out. Until
     the moment that it disappears from the screen, the document seems
     every bit as solid and real as if it had been typed out in ink on
     paper. But in the next moment, without warning, it is completely and
     irretrievably gone, as if it had never existed. The user is left
     with a feeling of disorientation (to say nothing of annoyance)
     stemming from a kind of metaphor shear--you realize that you've been
     living and thinking inside of a metaphor that is essentially bogus.

     So GUIs use metaphors to make computing easier, but they are bad
     metaphors. Learning to use them is essentially a word game, a
     process of learning new definitions of words like "window" and
     "document" and "save" that are different from, and in many cases
     almost diametrically opposed to, the old. Somewhat improbably, this
     has worked very well, at least from a commercial standpoint, which
     is to say that Apple/Microsoft have made a lot of money off of it.
     All of the other modern operating systems have learned that in order
     to be accepted by users they must conceal their underlying gutwork
     beneath the same sort of spackle. This has some advantages: if you
     know how to use one GUI operating system, you can probably work out
     how to use any other in a few minutes. Everything works a little
     differently, like European plumbing--but with some fiddling around,
     you can type a memo or surf the web.

     Most people who shop for OSes (if they bother to shop at all) are
     comparing not the underlying functions but the superficial look and
     feel. The average buyer of an OS is not really paying for, and is
     not especially interested in, the low-level code that allocates
     memory or writes bytes onto the disk. What we're really buying is a
     system of metaphors. And--much more important--what we're buying
     into is the underlying assumption that metaphors are a good way to
     deal with the world.

     Recently a lot of new hardware has become available that gives
     computers numerous interesting ways of affecting the real world:
     making paper spew out of printers, causing words to appear on
     screens thousands of miles away, shooting beams of radiation through
     cancer patients, creating realistic moving pictures of the Titanic.
     Windows is now used as an OS for cash registers and bank tellers'
     terminals. My satellite TV system uses a sort of GUI to change
     channels and show program guides. Modern cellular telephones have a
     crude GUI built into a tiny LCD screen. Even Legos now have a GUI:
     you can buy a Lego set called Mindstorms that enables you to build
     little Lego robots and program them through a GUI on your computer.

     So we are now asking the GUI to do a lot more than serve as a
     glorified typewriter. Now we want it to become a generalized tool for
     dealing with reality. This has become a bonanza for companies that
     make a living out of bringing new technology to the mass market.

     Obviously you cannot sell a complicated technological system to
     people without some sort of interface that enables them to use it.
     The internal combustion engine was a technological marvel in its
     day, but useless as a consumer good until a clutch, transmission,
     steering wheel and throttle were connected to it. That odd
     collection of gizmos, which survives to this day in every car on the
     road, made up what we would today call a user interface. But if cars
     had been invented after Macintoshes, carmakers would not have
     bothered to gin up all of these arcane devices. We would have a
     computer screen instead of a dashboard, and a mouse (or at best a
     joystick) instead of a steering wheel, and we'd shift gears by
     pulling down a menu:
PARK
---
REVERSE
---
NEUTRAL
---
3
2
1
---
Help...

     A few lines of computer code can thus be made to substitute for any
     imaginable mechanical interface. The problem is that in many cases
     the substitute is a poor one. Driving a car through a GUI would be a
     miserable experience. Even if the GUI were perfectly bug-free, it
     would be incredibly dangerous, because menus and buttons simply
     can't be as responsive as direct mechanical controls. My friend's
     dad, the gentleman who was restoring the MGB, never would have
     bothered with it if it had been equipped with a GUI. It wouldn't
     have been any fun.
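
     To put a number on "a few lines of computer code," here is a
     hypothetical sketch in Python's Tkinter toolkit of the pull-down
     gearshift mocked up above; nothing about it comes from any real
     car, which is rather the point.

# Hypothetical gearshift-as-menu. In a real car-by-GUI the callback
# would command the transmission; here it only prints the selection.
import tkinter as tk

root = tk.Tk()
root.title("Gearshift")

gear = tk.StringVar(value="PARK")   # the currently selected "gear"

def on_shift(choice):
    print("Shifting to", choice)

menu = tk.OptionMenu(root, gear, "PARK", "REVERSE", "NEUTRAL",
                     "3", "2", "1", command=on_shift)
menu.pack(padx=20, pady=20)

root.mainloop()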

     The steering wheel and gearshift lever were invented during an era
     when the most complicated technology in most homes was a butter
     churn. Those early carmakers were simply lucky, in that they could
     dream up whatever interface was best suited to the task of driving
     an automobile, and people would learn it. Likewise with the dial
     telephone and the AM radio. By the time of the Second World War,
     most people knew several interfaces: they could not only churn
     butter but also drive a car, dial a telephone, turn on a radio,
     summon flame from a cigarette lighter, and change a light bulb.

     But now every little thing--wristwatches, VCRs, stoves--is jammed
     with features, and every feature is useless without an interface. If
     you are like me, and like most other consumers, you have never used
     ninety percent of the available features on your microwave oven,
     VCR, or cellphone. You don't even know that these features exist.
     The small benefit they might bring you is outweighed by the sheer
     hassle of having to learn about them. This has got to be a big
     problem for makers of consumer goods, because they can't compete
     without offering features.

     It's no longer acceptable for engineers to invent a wholly novel
     user interface for every new product, as they did in the case of the
     automobile, partly because it's too expensive and partly because
     ordinary people can only learn so much. If the VCR had been invented
     a hundred years ago, it would have come with a thumbwheel to adjust
     the tracking and a gearshift to change between forward and reverse
     and a big cast-iron handle to load or to eject the cassettes. It
     would have had a big analog clock on the front of it, and you would
     have set the time by moving the hands around on the dial. But
     because the VCR was invented when it was--during a sort of awkward
     transitional period between the era of mechanical interfaces and
     GUIs--it just had a bunch of pushbuttons on the front, and in order
     to set the time you had to push the buttons in just the right way.
     This must have seemed reasonable enough to the engineers responsible
     for it, but to many users it was simply impossible. Thus the famous
     blinking 12:00 that appears on so many VCRs. Computer people call
     this "the blinking twelve problem". When they talk about it, though,
     they usually aren't talking about VCRs.

     Modern VCRs usually have some kind of on-screen programming, which
     means that you can set the time and control other features through a
     sort of primitive GUI. GUIs have virtual pushbuttons too, of course,
     but they also have other types of virtual controls, like radio
     buttons, checkboxes, text entry boxes, dials, and scrollbars.
     Interfaces made out of these components seem to be a lot easier, for
     many people, than pushing those little buttons on the front of the
     machine, and so the blinking 12:00 itself is slowly disappearing
     from America's living rooms. The blinking twelve problem has moved
     on to plague other technologies.

     So the GUI has gone beyond being an interface to personal computers,
     and become a sort of meta-interface that is pressed into service for
     every new piece of consumer technology. It is rarely an ideal fit,
     but having an ideal, or even a good interface is no longer the
     priority; the important thing now is having some kind of interface
     that customers will actually use, so that manufacturers can claim,
     with a straight face, that they are offering new features.

     We want GUIs largely because they are convenient and because they
     are easy--or at least the GUI makes it seem that way. Of course,
     nothing is really easy and simple, and putting a nice interface on
     top of it does not change that fact. A car controlled through a GUI
     might seem easier to drive than one controlled through pedals and a
     steering wheel, but it would be incredibly dangerous.

     By using GUIs all the time we have insensibly bought into a premise
     that few people would have accepted if it were presented to them
     bluntly: namely, that hard things can be made easy, and complicated
     things simple, by putting the right interface on them. In order to
     understand how bizarre this is, imagine that book reviews were
     written according to the same values system that we apply to user
     interfaces: "The writing in this book is marvelously simple-minded
     and glib; the author glosses over complicated subjects and employs
     facile generalizations in almost every sentence. Readers rarely have
     to think, and are spared all of the difficulty and tedium typically
     involved in reading old-fashioned books." As long as we stick to
     simple operations like setting the clocks on our VCRs, this is not
     so bad. But as we try to do more ambitious things with our
     technologies, we inevitably run into the problem of:

     METAPHOR SHEAR

     I began using Microsoft Word as soon as the first version was
     released around 1985. After some initial hassles I found it to be a
     better tool than MacWrite, which was its only competition at the
     time. I wrote a lot of stuff in early versions of Word, storing it
     all on floppies, and transferred the contents of all my floppies to
     my first hard drive, which I acquired around 1987. As new versions
     of Word came out I faithfully upgraded, reasoning that as a writer
     it made sense for me to spend a certain amount of money on tools.

     Sometime in the mid-1990's I attempted to open one of my old,
     circa-1985 Word documents using the version of Word then current:
     6.0. It didn't work. Word 6.0 did not recognize a document created by
     an earlier version of itself. By opening it as a text file, I was
     able to recover the sequences of letters that made up the text of
     the document. My words were still there. But the formatting had been
     run through a log chipper--the words I'd written were interrupted by
     spates of empty rectangular boxes and gibberish.
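
     Roughly speaking, "opening it as a text file" amounts to keeping
     the runs of printable characters and letting the formatting codes
     fall away as gibberish; a hypothetical few lines of Python would do
     it:

import re
import sys

def recover_text(path, min_run=4):
    # Keep only runs of printable ASCII at least min_run characters
    # long; binary headers and formatting codes are discarded.
    data = open(path, "rb").read()
    runs = re.findall(rb"[ -~]{%d,}" % min_run, data)
    return b"\n".join(runs).decode("ascii")

if __name__ == "__main__":
    print(recover_text(sys.argv[1]))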

     Now, in the context of a business (the chief market for Word) this
     sort of thing is only an annoyance--one of the routine hassles that
     go along with using computers. It's easy to buy little file
     converter programs that will take care of this problem. But if you
     are a writer whose career is words, whose professional identity is a
     corpus of written documents, this kind of thing is extremely
     disquieting. There are very few fixed assumptions in my line of
     work, but one of them is that once you have written a word, it is
     written, and cannot be unwritten. The ink stains the paper, the
     chisel cuts the stone, the stylus marks the clay, and something has
     irrevocably happened (my brother-in-law is a theologian who reads
     3250-year-old cuneiform tablets--he can recognize the handwriting of
     particular scribes, and identify them by name). But word-processing
     software--particularly the sort that employs special, complex file
     formats--has the eldritch power to unwrite things. A small change in
     file formats, or a few twiddled bits, and months' or years' literary
     output can cease to exist.

     Now this was technically a fault in the application (Word 6.0 for
     the Macintosh) not the operating system (MacOS 7 point something)
     and so the initial target of my annoyance was the people who were
     responsible for Word. But. On the other hand, I could have chosen
     the "save as text" option in Word and saved all of my documents as
     simple telegrams, and this problem would not have arisen. Instead I
     had allowed myself to be seduced by all of those flashy formatting
     options that hadn't even existed until GUIs had come along to make
     them practicable. I had gotten into the habit of using them to make
     my documents look pretty (perhaps prettier than they deserved to
     look; all of the old documents on those floppies turned out to be
     more or less crap). Now I was paying the price for that
     self-indulgence. Technology had moved on and found ways to make my
     documents look even prettier, and the consequence of it was that all
     old ugly documents had ceased to exist.

     It was--if you'll pardon me for a moment's strange little
     fantasy--as if I'd gone to stay at some resort, some exquisitely
     designed and art-directed hotel, placing myself in the hands of past
     masters of the Sensorial Interface, and had sat down in my room and
     written a story in ballpoint pen on a yellow legal pad, and when I
     returned from dinner, discovered that the maid had taken my work
     away and left behind in its place a quill pen and a stack of fine
     parchment--explaining that the room looked ever so much finer this
     way, and it was all part of a routine upgrade. But written on these
     sheets of paper, in flawless penmanship, were long sequences of
     words chosen at random from the dictionary. Appalling, sure, but I
     couldn't really lodge a complaint with the management, because by
     staying at this resort I had given my consent to it. I had
     surrendered my Morlock credentials and become an Eloi.

     LINUX

     During the late 1980's and early 1990's I spent a lot of time
     programming Macintoshes, and eventually decided to fork over
     several hundred dollars for an Apple product called the Macintosh
     Programmer's Workshop, or MPW. MPW had competitors, but it was
     unquestionably the premier software development system for the Mac.
     It was what Apple's own engineers used to write Macintosh code.
     Given that MacOS was far more technologically advanced, at the time,
     than its competition, and that Linux did not even exist yet, and
     given that this was the actual program used by Apple's world-class
     team of creative engineers, I had high expectations. It arrived on a
     stack of floppy disks about a foot high, and so there was plenty of
     time for my excitement to build during the endless installation
     process. The first time I launched MPW, I was probably expecting
     some kind of touchy-feely multimedia showcase. Instead it was
     austere, almost to the point of being intimidating. It was a
     scrolling window into which you could type simple, unformatted text.
     The system would then interpret these lines of text as commands, and
     try to execute them.

     It was, in other words, a glass teletype running a command line
     interface. It came with all sorts of cryptic but powerful commands,
     which could be invoked by typing their names, and which I learned to
     use only gradually. It was not until a few years later, when I began
     messing around with Unix, that I understood that the command line
     interface embodied in MPW was a re-creation of Unix.

     In other words, the first thing that Apple's hackers had done when
     they'd got the MacOS up and running--probably even before they'd
     gotten it up and running--was to re-create the Unix interface, so
     that they would be able to get some useful work done. At the time, I
     simply couldn't get my mind around this, but: as far as Apple's
     hackers were concerned, the Mac's vaunted Graphical User Interface
     was an impediment, something to be circumvented before the little
     toaster even came out onto the market.

     Even before my Powerbook crashed and obliterated my big file in July
     1995, there had been danger signs. An old college buddy of mine, who
     starts and runs high-tech companies in Boston, had developed a
     commercial product using Macintoshes as the front end. Basically the
     Macs were high-performance graphics terminals, chosen for their
     sweet user interface, giving users access to a large database of
     graphical information stored on a network of much more powerful, but
     less user-friendly, computers. This fellow was the second person who
     turned me on to Macintoshes, by the way, and through the mid-1980's
     we had shared the thrill of being high-tech cognoscenti, using
     superior Apple technology in a world of DOS-using knuckleheads.
     Early versions of my friend's system had worked well, he told me,
     but when several machines joined the network, mysterious crashes
     began to occur; sometimes the whole network would just freeze. It
     was one of those bugs that could not be reproduced easily. Finally
     they figured out that these network crashes were triggered whenever
     a user, scanning the menus for a particular item, held down the
     mouse button for more than a couple of seconds.

     Fundamentally, the MacOS could only do one thing at a time. Drawing
     a menu on the screen is one thing. So when a menu was pulled down,
     the Macintosh was not capable of doing anything else until that
     indecisive user released the button.

     This is not such a bad thing in a single-user, single-process
     machine (although it's a fairly bad thing), but it's no good in a
     machine that is on a network, because being on a network implies
     some kind of continual low-level interaction with other machines. By
     failing to respond to the network, the Mac caused a network-wide
     crash.

     In order to work with other computers, and with networks, and with
     various different types of hardware, an OS must be incomparably more
     complicated and powerful than either MS-DOS or the original MacOS.
     The only way of connecting to the Internet that's worth taking
     seriously is PPP, the Point-to-Point Protocol, which (never mind the
     details) makes your computer--temporarily--a full-fledged member of
     the Global Internet, with its own unique address, and various
     privileges, powers, and responsibilities appertaining thereunto.
     Technically it means your machine is running the TCP/IP protocol,
     which, to make a long story short, revolves around sending packets
     of data back and forth, in no particular order, and at unpredictable
     times, according to a clever and elegant set of rules. But sending a
     packet of data is one thing, and so an OS that can only do one thing
     at a time cannot simultaneously be part of the Internet and do
     anything else. When TCP/IP was invented, running it was an honor
     reserved for Serious Computers--mainframes and high-powered
     minicomputers used in technical and commercial settings--and so the
     protocol is engineered around the assumption that every computer
     using it is a serious machine, capable of doing many things at once.
     Not to put too fine a point on it, a Unix machine. Neither MacOS nor
     MS-DOS was originally built with that in mind, and so when the
     Internet got hot, radical changes had to be made.
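
     At the programming level, by the way, "sending a packet" is a
     small and concrete act. The sketch below, in C, is purely
     illustrative--the destination address and port are arbitrary
     stand-ins, and it has nothing to do with PPP per se--but it shows
     what handing the kernel one little buffer of bytes, addressed to
     some other machine, actually looks like. Being a full-fledged
     member of the Internet means being ready to do this, and to field
     whatever comes back, at any moment, alongside everything else the
     machine is doing.

/* Send one UDP packet.  Illustrative sketch only; the destination
   address and port are arbitrary stand-ins. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void)
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);   /* a UDP socket */
    if (fd < 0) {
        perror("socket");
        return 1;
    }

    struct sockaddr_in dest;
    memset(&dest, 0, sizeof dest);
    dest.sin_family = AF_INET;
    dest.sin_port = htons(9);                      /* "discard" port */
    dest.sin_addr.s_addr = inet_addr("192.0.2.1"); /* example address */

    const char msg[] = "one packet, out into the ether";
    if (sendto(fd, msg, sizeof msg, 0,
               (struct sockaddr *)&dest, sizeof dest) < 0)
        perror("sendto");

    close(fd);
    return 0;
}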

     When my Powerbook broke my heart, and when Word stopped recognizing
     my old files, I jumped to Unix. The obvious alternative to MacOS
     would have been Windows. I didn't really have anything against
     Microsoft, or Windows. But it was pretty obvious, now, that old PC
     operating systems were overreaching, and showing the strain, and,
     perhaps, were best avoided until they had learned to walk and chew
     gum at the same time.

     The changeover took place on a particular day in the summer of 1995.
     I had been in San Francisco for a couple of weeks, using my PowerBook
     to work on a document. The document was too big to fit onto a single
     floppy, and so I hadn't made a backup since leaving home. The
     PowerBook crashed and wiped out the entire file.

     It happened just as I was on my way out the door to visit a company
     called Electric Communities, which in those days was in Los Altos. I
     took my PowerBook with me. My friends at Electric Communities were
     Mac users who had all sorts of utility software for unerasing files
     and recovering from disk crashes, and I was certain I could get most
     of the file back.

     As it turned out, two different Mac crash recovery utilities were
     unable to find any trace that my file had ever existed. It was
     completely and systematically wiped out. We went through that hard
     disk block by block and found disjointed fragments of countless old,
     discarded, forgotten files, but none of what I wanted. The metaphor
     shear was especially brutal that day. It was sort of like watching
     the girl you've been in love with for ten years get killed in a car
     wreck, and then attending her autopsy, and learning that underneath
     the clothes and makeup she was just flesh and blood.

     I must have been reeling around the offices of Electric Communities
     in some kind of primal Jungian fugue, because at this moment three
     weirdly synchronistic things happened.

     (1) Randy Farmer, a co-founder of the company, came in for a quick
     visit along with his family--he was recovering from back surgery at
     the time. He had some hot gossip: "Windows 95 mastered today." What
     this meant was that Microsoft's new operating system had, on this
     day, been placed on a special compact disk known as a golden master,
     which would be used to stamp out a jintillion copies in preparation
     for its thunderous release a few weeks later. This news was received
     peevishly by the staff of Electric Communities, including one whose
     office door was plastered with the usual assortment of cartoons and
     novelties, e.g.

     (2) a copy of a Dilbert cartoon in which Dilbert, the long-suffering
     corporate software engineer, encounters a portly, bearded, hairy man
     of a certain age--a bit like Santa Claus, but darker, with a certain
     edge about him. Dilbert recognizes this man, based upon his
     appearance and affect, as a Unix hacker, and reacts with a certain
     mixture of nervousness, awe, and hostility. Dilbert jabs weakly at
     the disturbing interloper for a couple of frames; the Unix hacker
     listens with a kind of infuriating, beatific calm, then, in the last
     frame, reaches into his pocket. "Here's a nickel, kid," he says, "go
     buy yourself a real computer."

     (3) the owner of the door, and the cartoon, was one Doug Barnes.
     Barnes was known to harbor certain heretical opinions on the subject
     of operating systems. Unlike most Bay Area techies who revered the
     Macintosh, considering it to be a true hacker's machine, Barnes was
     fond of pointing out that the Mac, with its hermetically sealed
     architecture, was actually hostile to hackers, who are prone to
     tinkering and dogmatic about openness. By contrast, the
     IBM-compatible line of machines, which can easily be taken apart and
     plugged back together, was much more hackable.

     So when I got home I began messing around with Linux, which is one
     of many, many different concrete implementations of the abstract,
     Platonic ideal called Unix. I was not looking forward to changing
     over to a new OS, because my credit cards were still smoking from
     all the money I'd spent on Mac hardware over the years. But Linux's
     great virtue was, and is, that it would run on exactly the same sort
     of hardware as the Microsoft OSes--which is to say, the cheapest
     hardware in existence. As if to demonstrate why this was a great
     idea, I was, within a week or two of returning home, able to get my
     hands on a then-decent computer (a 33-MHz 486 box) for free, because
     I knew a guy who worked in an office where they were simply being
     thrown away. Once I got it home, I yanked the hood off, stuck my
     hands in, and began switching cards around. If something didn't
     work, I went to a used-computer outlet and pawed through a bin full
     of components and bought a new card for a few bucks.

     The availability of all this cheap but effective hardware was an
     unintended consequence of decisions that had been made more than a
     decade earlier by IBM and Microsoft. When Windows came out, and
     brought the GUI to a much larger market, the hardware regime
     changed: the cost of color video cards and high-resolution monitors
     began to drop, and is dropping still. This free-for-all approach to
     hardware meant that Windows was unavoidably clunky compared to
     MacOS. But the GUI brought computing to such a vast audience that
     volume went way up and prices collapsed. Meanwhile Apple, which so
     badly wanted a clean, integrated OS, with video neatly married to
     the processing hardware, had fallen far behind in market share, at
     least partly because their beautiful hardware cost so much.

     But the price that we Mac owners had to pay for superior aesthetics
     and engineering was not merely a financial one. There was a cultural
     price too, stemming from the fact that we couldn't open up the hood
     and mess around with it. Doug Barnes was right. Apple, in spite of
     its reputation as the machine of choice of scruffy, creative hacker
     types, had actually created a machine that discouraged hacking,
     while Microsoft, viewed as a technological laggard and copycat, had
     created a vast, disorderly parts bazaar--a primordial soup that
     eventually self-assembled into Linux.

     THE HOLE HAWG OF OPERATING SYSTEMS

     Unix has always lurked provocatively in the background of the
     operating system wars, like the Russian Army. Most people know it
     only by reputation, and its reputation, as the Dilbert cartoon
     suggests, is mixed. But everyone seems to agree that if it could
     only get its act together and stop surrendering vast tracts of rich
     agricultural land and hundreds of thousands of prisoners of war to
     the onrushing invaders, it could stomp them (and all other
     opposition) flat.

     It is difficult to explain how Unix has earned this respect without
     going into mind-smashing technical detail. Perhaps the gist of it
     can be explained by telling a story about drills.

     The Hole Hawg is a drill made by the Milwaukee Tool Company. If you
     look in a typical hardware store you may find smaller Milwaukee
     drills but not the Hole Hawg, which is too powerful and too
     expensive for homeowners. The Hole Hawg does not have the
     pistol-like design of a cheap homeowner's drill. It is a cube of
     solid metal with a handle sticking out of one face and a chuck
     mounted in another. The cube contains a disconcertingly potent
     electric motor. You can hold the handle and operate the trigger with
     your index finger, but unless you are exceptionally strong you
     cannot control the weight of the Hole Hawg with one hand; it is a
     two-hander all the way. In order to fight off the counter-torque of
     the Hole Hawg you use a separate handle (provided), which you screw
     into one side of the iron cube or the other depending on whether you
     are using your left or right hand to operate the trigger. This
     handle is not a sleek, ergonomically designed item as it would be in
     a homeowner's drill. It is simply a foot-long chunk of regular
     galvanized pipe, threaded on one end, with a black rubber handle on
     the other. If you lose it, you just go to the local plumbing supply
     store and buy another chunk of pipe.

     During the Eighties I did some construction work. One day, another
     worker leaned a ladder against the outside of the building that we
     were putting up, climbed up to the second-story level, and used the
     Hole Hawg to drill a hole through the exterior wall. At some point,
     the drill bit caught in the wall. The Hole Hawg, following its one
     and only imperative, kept going. It spun the worker's body around
     like a rag doll, causing him to knock his own ladder down.
     Fortunately he kept his grip on the Hole Hawg, which remained lodged
     in the wall, and he simply dangled from it and shouted for help
     until someone came along and reinstated the ladder.

     I myself used a Hole Hawg to drill many holes through studs, which
     it did as a blender chops cabbage. I also used it to cut a few
     six-inch-diameter holes through an old lath-and-plaster ceiling. I
     chucked in a new hole saw, went up to the second story, reached down
     between the newly installed floor joists, and began to cut through
     the first-floor ceiling below. Where my homeowner's drill had
     labored and whined to spin the huge bit around, and had stalled at
     the slightest obstruction, the Hole Hawg rotated with the stupid
     consistency of a spinning planet. When the hole saw seized up, the
     Hole Hawg spun itself and me around, and crushed one of my hands
     between the steel pipe handle and a joist, producing a few
     lacerations, each surrounded by a wide corona of deeply bruised
     flesh. It also bent the hole saw itself, though not so badly that I
     couldn't use it. After a few such run-ins, when I got ready to use
     the Hole Hawg my heart actually began to pound with atavistic
     terror.

     But I never blamed the Hole Hawg; I blamed myself. The Hole Hawg is
     dangerous because it does exactly what you tell it to. It is not
     bound by the physical limitations that are inherent in a cheap
     drill, and neither is it limited by safety interlocks that might be
     built into a homeowner's product by a liability-conscious
     manufacturer. The danger lies not in the machine itself but in the
     user's failure to envision the full consequences of the instructions
     he gives to it.

     A smaller tool is dangerous too, but for a completely different
     reason: it tries to do what you tell it to, and fails in some way
     that is unpredictable and almost always undesirable. But the Hole
     Hawg is like the genie of the ancient fairy tales, who carries out
     his master's instructions literally and precisely and with unlimited
     power, often with disastrous, unforeseen consequences.

     Pre-Hole Hawg, I used to examine the drill selection in hardware
     stores with what I thought was a judicious eye, scorning the smaller
     low-end models and hefting the big expensive ones appreciatively,
     wishing I could afford one of them babies. Now I view them all with
     such contempt that I do not even consider them to be real
     drills--merely scaled-up toys designed to exploit the
     self-delusional tendencies of soft-handed homeowners who want to
     believe that they have purchased an actual tool. Their plastic
     casings, carefully designed and focus-group-tested to convey a
     feeling of solidity and power, seem disgustingly flimsy and cheap to
     me, and I am ashamed that I was ever bamboozled into buying such
     knickknacks.

     It is not hard to imagine what the world would look like to someone
     who had been raised by contractors and who had never used any drill
     other than a Hole Hawg. Such a person, presented with the best and
     most expensive hardware-store drill, would not even recognize it as
     such. He might instead misidentify it as a child's toy, or some kind
     of motorized screwdriver. If a salesperson or a deluded homeowner
     referred to it as a drill, he would laugh and tell them that they
     were mistaken--they simply had their terminology wrong. His
     interlocutor would go away irritated, and probably feeling rather
     defensive about his basement full of cheap, dangerous, flashy,
     colorful tools.

     Unix is the Hole Hawg of operating systems, and Unix hackers, like
     Doug Barnes and the guy in the Dilbert cartoon and many of the other
     people who populate Silicon Valley, are like contractor's sons who
     grew up using only Hole Hawgs. They might use Apple/Microsoft OSes
     to write letters, play video games, or balance their checkbooks, but
     they cannot really bring themselves to take these operating systems
     seriously.

     THE ORAL TRADITION

     Unix is hard to learn. The process of learning it is one of multiple
     small epiphanies. Typically you are just on the verge of inventing
     some necessary tool or utility when you realize that someone else
     has already invented it, and built it in, and this explains some odd
     file or directory or command that you have noticed but never really
     understood before.

     For example there is a command (a small program, part of the OS)
     called whoami, which enables you to ask the computer who it thinks
     you are. On a Unix machine, you are always logged in under some
     name--possibly even your own! What files you may work with, and what
     software you may use, depends on your identity. When I started out
     using Linux, I was on a non-networked machine in my basement, with
     only one user account, and so when I became aware of the whoami
     command it struck me as ludicrous. But once you are logged in as one
     person, you can temporarily switch over to a pseudonym in order to
     access different files. If your machine is on the Internet, you can
     log onto other computers, provided you have a user name and a
     password. At that point the distant machine becomes no different in
     practice from the one right in front of you. These changes in
     identity and location can easily become nested inside each other,
     many layers deep, even if you aren't doing anything nefarious. Once
     you have forgotten who and where you are, the whoami command is
     indispensable. I use it all the time.
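
     A rough sketch, in C, of what a whoami-style command amounts to
     might look like the following. This is an illustration, not the
     actual GNU source: it asks the kernel for the effective user ID
     of the process, looks up the matching entry in the password
     database, and prints the name.

/* whoami, more or less: an illustrative sketch, not the real thing. */
#include <stdio.h>
#include <unistd.h>
#include <pwd.h>

int main(void)
{
    /* Ask the kernel which user ID this process is running under,
       then look up the login name that goes with it. */
    struct passwd *pw = getpwuid(geteuid());
    if (pw == NULL) {
        fprintf(stderr, "whoami: cannot figure out who you are\n");
        return 1;
    }
    printf("%s\n", pw->pw_name);
    return 0;
}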

     The file systems of Unix machines all have the same general
     structure. On your flimsy operating systems, you can create
     directories (folders) and give them names like Frodo or My Stuff and
     put them pretty much anywhere you like. But under Unix the highest
     level--the root--of the filesystem is always designated with the
     single character "/" and it always contains the same set of
     top-level directories:
/usr
/etc
/var
/bin
/proc
/boot
/home
/root
/sbin
/dev
/lib
/tmp

     and each of these directories typically has its own distinct
     structure of subdirectories. Note the obsessive use of abbreviations
     and avoidance of capital letters; this is a system invented by
     people to whom repetitive stress disorder is what black lung is to
     miners. Long names get worn down to three-letter nubbins, like
     stones smoothed by a river.

     This is not the place to try to explain why each of the above
     directories exists, and what is contained in it. At first it all
     seems obscure; worse, it seems deliberately obscure. When I started
     using Linux I was accustomed to being able to create directories
     wherever I wanted and to give them whatever names struck my fancy.
     Under Unix you are free to do that, of course (you are free to do
     anything) but as you gain experience with the system you come to
     understand that the directories listed above were created for the
     best of reasons and that your life will be much easier if you follow
     along (within /home, by the way, you have pretty much unlimited
     freedom).
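
     If you want to see that skeleton on a live system, a few lines
     of C will do it. This sketch, which is illustrative and not part
     of any distribution, simply opens the root directory and prints
     whatever it finds there--on most Linux boxes, the same terse list
     given above, plus the "." and ".." entries.

/* List the entries of the root directory "/".  Illustrative sketch. */
#include <stdio.h>
#include <dirent.h>

int main(void)
{
    DIR *root = opendir("/");
    if (root == NULL) {
        perror("opendir");
        return 1;
    }
    struct dirent *entry;
    /* readdir() hands back one directory entry per call, including
       "." and "..", until there are none left. */
    while ((entry = readdir(root)) != NULL)
        printf("/%s\n", entry->d_name);
    closedir(root);
    return 0;
}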

     After this kind of thing has happened several hundred or thousand
     times, the hacker understands why Unix is the way it is, and agrees
     that it wouldn't be the same any other way. It is this sort of
     acculturation that gives Unix hackers their confidence in the
     system, and the attitude of calm, unshakable, annoying superiority
     captured in the Dilbert cartoon. Windows 95 and MacOS are products,
     contrived by engineers in the service of specific companies. Unix,
     by contrast, is not so much a product as it is a painstakingly
     compiled oral history of the hacker subculture. It is our Gilgamesh
     epic.

     What made old epics like Gilgamesh so powerful and so long-lived was
     that they were living bodies of narrative that many people knew by
     heart, and told over and over again--making their own personal
     embellishments whenever it struck their fancy. The bad
     embellishments were shouted down, the good ones picked up by others,
     polished, improved, and, over time, incorporated into the story.
     Likewise, Unix is known, loved, and understood by so many hackers
     that it can be re-created from scratch whenever someone needs it.
     This is very difficult to understand for people who are accustomed
     to thinking of OSes as things that absolutely have to be bought.

     Many hackers have launched more or less successful
     re-implementations of the Unix ideal. Each one brings in new
     embellishments. Some of them die out quickly, some are merged with
     similar, parallel innovations created by different hackers attacking
     the same problem, others still are embraced, and adopted into the
     epic. Thus Unix has slowly accreted around a simple kernel and
     acquired a kind of complexity and asymmetry about it that is
     organic, like the roots of a tree, or the branchings of a coronary
     artery. Understanding it is more like anatomy than physics.

     For at least a year, prior to my adoption of Linux, I had been
     hearing about it. Credible, well-informed people kept telling me
     that a bunch of hackers had got together an implementation of Unix
     that could be downloaded, free of charge, from the Internet. For a
     long time I could not bring myself to take the notion seriously. It
     was like hearing rumors that a group of model rocket enthusiasts had
     created a completely functional Saturn V by exchanging blueprints on
     the Net and mailing valves and flanges to each other.

     But it's true. Credit for Linux generally goes to its human
     namesake, one Linus Torvalds, a Finn who got the whole thing rolling
     in 1991 when he used some of the GNU tools to write the beginnings
     of a Unix kernel that could run on PC-compatible hardware. And
     indeed Torvalds deserves all the credit he has ever gotten, and a
     whole lot more. But he could not have made it happen by himself, any
     more than Richard Stallman could have. To write code at all,
     Torvalds had to have cheap but powerful development tools, and these
     he got from Stallman's GNU project.

     And he had to have cheap hardware on which to write that code. Cheap
     hardware is a much harder thing to arrange than cheap software; a
     single person (Stallman) can write software and put it up on the Net
     for free, but in order to make hardware it's necessary to have a
     whole industrial infrastructure, which is not cheap by any stretch
     of the imagination. Really the only way to make hardware cheap is to
     punch out an incredible number of copies of it, so that the unit
     cost eventually drops. For reasons already explained, Apple had no
     desire to see the cost of hardware drop. The only reason Torvalds
     had cheap hardware was Microsoft.

     Microsoft refused to go into the hardware business, insisted on
     making its software run on hardware that anyone could build, and
     thereby created the market conditions that allowed hardware prices
     to plummet. In trying to understand the Linux phenomenon, then, we
     have to look not to a single innovator but to a sort of bizarre
     Trinity: Linus Torvalds, Richard Stallman, and Bill Gates. Take away
     any of these three and Linux would not exist.

     OS SHOCK

     Young Americans who leave their great big homogeneous country and
     visit some other part of the world typically go through several
     stages of culture shock: first, dumb wide-eyed astonishment. Then a
     tentative engagement with the new country's manners, cuisine, public
     transit systems and toilets, leading to a brief period of fatuous
     confidence that they are instant experts on the new country. As the
     visit wears on, homesickness begins to set in, and the traveler
     begins to appreciate, for the first time, how much he or she took
     for granted at home. At the same time it begins to seem obvious that
     many of one's own customs and traditions are essentially arbitrary,
     and could have been different; driving on the right side of the
     road, for example. When the traveler returns home and takes stock of
     the experience, he or she may have learned a good deal more about
     America than about the country they went to visit.

     For the same reasons, Linux is worth trying. It is a strange country
     indeed, but you don't have to live there; a brief sojourn suffices
     to give some flavor of the place and--more importantly--to lay bare
     everything that is taken for granted, and all that could have been
     done differently, under Windows or MacOS.

     You can't try it unless you install it. With any other OS,
     installing it would be a straightforward transaction: in exchange
     for money, some company would give you a CD-ROM, and you would be on
     your way. But a lot is subsumed in that kind of transaction, and has
     to be gone through and picked apart.

     We like plain dealings and straightforward transactions in America.
     If you go to Egypt and, say, take a taxi somewhere, you become a
     part of the taxi driver's life; he refuses to take your money
     because it would demean your friendship, he follows you around town,
     and weeps hot tears when you get in some other guy's taxi. You end
     up meeting his kids at some point, and have to devote all sorts of
     ingenuity to finding some way to compensate him without insulting
     his honor. It is exhausting. Sometimes you just want a simple
     Manhattan-style taxi ride.

     But in order to have an American-style setup, where you can just go
     out and hail a taxi and be on your way, there must exist a whole
     hidden apparatus of medallions, inspectors, commissions, and so
     forth--which is fine as long as taxis are cheap and you can always
     get one. When the system fails to work in some way, it is mysterious
     and infuriating and turns otherwise reasonable people into
     conspiracy theorists. But when the Egyptian system breaks down, it
     breaks down transparently. You can't get a taxi, but your driver's
     nephew will show up, on foot, to explain the problem and apologize.

     Microsoft and Apple do things the Manhattan way, with vast
     complexity hidden behind a wall of interface. Linux does things the
     Egypt way, with vast complexity strewn about all over the landscape.
     If you've just flown in from Manhattan, your first impulse will be
     to throw up your hands and say "For crying out loud! Will you people
     get a grip on yourselves!?" But this does not make friends in
     Linux-land any better than it would in Egypt.

     You can suck Linux right out of the air, as it were, by downloading
     the right files and putting them in the right places, but there
     probably are not more than a few hundred people in the world who
     could create a functioning Linux system in that way. What you really
     need is a distribution of Linux, which means a prepackaged set of
     files. But distributions are a separate thing from Linux per se.

     Linux per se is not a specific set of ones and zeroes, but a
     self-organizing Net subculture. The end result of its collective
     lucubrations is a vast body of source code, almost all written in C
     (the dominant computer programming language). "Source code" just
     means a computer program as typed in and edited by some hacker. If
     it's in C, the file name will probably have .c or .cpp on the end of
     it, depending on whether it is plain C or its offshoot, C++; if it's
     in some other
     language it will have some other suffix. Frequently these sorts of
     files can be found in a directory with the name /src which is the
     hacker's Hebraic abbreviation of "source."
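
     To make the term concrete, here is more or less the smallest
     possible C source file: a hypothetical hello.c, invented for
     illustration rather than taken from any actual package. It is
     nothing but plain text, typed into an editor by some hacker.

/* hello.c -- a hypothetical, minimal example of a source code file. */
#include <stdio.h>

int main(void)
{
    /* Meaningless to the machine until a compiler translates it. */
    printf("hello, world\n");
    return 0;
}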

     Source files are useless to your computer, and of little interest to
     most users, but they are of gigantic cultural and political
     significance, because Microsoft and Apple keep them secret while
     Linux makes them public. They are the family jewels. They are the
     sort of thing that in Hollywood thrillers is used as a McGuffin: the
     plutonium bomb core, the top-secret blueprints, the suitcase of
     bearer bonds, the reel of microfilm. If the source files for Windows
     or MacOS were made public on the Net, then those OSes would become
     free, like Linux--only not as good, because no one would be around
     to fix bugs and answer questions. Linux is "open source" software,
     meaning, simply, that anyone can get copies of its source code
     files.

     Your computer doesn't want source code any more than you do; it
     wants object code. Object code files typically have the suffix .o
     and are unreadable to all but a few, highly strange humans, because
     they consist of ones and zeroes. Accordingly, this sort of file
     commonly shows up in a directory with the name /bin, for "binary."

     Source files are simply ASCII text files. ASCII denotes a particular
     way of encoding letters into bit patterns. In an ASCII file, each
     character gets one byte--eight bits--all to itself. Eight binary
     digits can form 256 unique patterns, though ASCII proper assigns
     meanings to only 128 of them. In practice, of course, we tend
     to limit ourselves to the familiar letters and digits. The
     bit-patterns used to represent those letters and digits are the same
     ones that were physically punched into the paper tape by my high
     school teletype, which in turn were the same ones used by the
     telegraph industry for decades previously. ASCII text files, in
     other words, are telegrams, and as such they have no typographical
     frills. But for the same reason they are eternal, because the code
     never changes, and universal, because every piece of text editing
     and word processing software ever written knows about this code.
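
     To see the encoding at work, a few lines of C suffice; this is
     illustrative only, and the sample string is arbitrary. It prints
     some characters next to the numeric codes that ASCII assigns
     them.

/* Print a few characters alongside their ASCII codes. */
#include <stdio.h>

int main(void)
{
    const char *sample = "Unix";
    for (const char *p = sample; *p != '\0'; p++)
        printf("'%c' = %3d decimal = 0x%02x hex\n",
               *p, *p, (unsigned char)*p);
    return 0;
}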

     Therefore just about any software can be used to create, edit, and
     read source code files. Object code files, then, are created from
     these source files by a piece of software called a compiler, and
     forged into a working application by another piece of software
     called a linker.
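
     As a rough sketch of that division of labor, consider two
     hypothetical C source files. The compiler turns each into an
     object file without knowing about the other; the linker then
     resolves the reference to greet() and forges the pieces into one
     program. The gcc commands in the comment are one typical way of
     driving the process, not the only one.

/* greet.c -- one translation unit; the compiler turns it into greet.o */
#include <stdio.h>

void greet(void)
{
    printf("hello from a separately compiled file\n");
}

/* main.c -- a second translation unit.  It calls greet() without
   knowing where that code lives; the linker supplies the connection.
   A typical build:  gcc -c greet.c main.c
                     gcc greet.o main.o -o hello                     */
void greet(void);

int main(void)
{
    greet();
    return 0;
}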

     The triad of editor, compiler, and linker, taken together, forms the
     core of a software development system. Now, it is possible to spend
     a lot of money on shrink-wrapped development systems with lovely
     graphical user interfaces and various ergonomic enhancements. In
     some cases it might even be a good and reasonable way to spend
     money. But on this side of the road, as it were, the very best
     software is usually the free stuff. Editor, compiler and linker are
     to hackers what ponies, stirrups, and archery sets were to the
     Mongols. Hackers live in the saddle, and hack on their own tools
     even while they are using them to create new applications. It is
     quite inconceivable that superior hacking tools could have been
     created from a blank sheet of paper by product engineers. Even if
     they are the brightest engineers in the world they are simply
     outnumbered.

     In the GNU/Linux world there are two major text editing programs:
     the minimalist vi (known in some implementations as elvis) and the
     maximalist emacs. I use emacs, which might be thought of as a
     thermonuclear word processor. It was created by Richard Stallman;
     enough said. It is written largely in Lisp, which is the only computer
     language that is beautiful. It is colossal, and yet it only edits
     straight ASCII text files, which is to say, no fonts, no boldface,
     no underlining. In other words, the engineer-hours that, in the case
     of Microsoft Word, were devoted to features like mail merge, and the
     ability to embed feature-length motion pictures in corporate
     memoranda, were, in the case of emacs, focused with maniacal
     intensity on the deceptively simple-seeming problem of editing text.
     If you are a professional writer--i.e., if someone else is getting
     paid to worry about how your words are formatted and printed--emacs
     outshines all other editing software in approximately the same way
     that the noonday sun does the stars. It is not just bigger and
     brighter; it simply makes everything else vanish. For page layout
     and printing you can use TeX: a vast corpus of typesetting lore
     written in C and also available on the Net for free.

     I could say a lot about emacs and TeX, but right now I am trying to
     tell a story about how to actually install Linux on your machine.
     The hard-core survivalist approach would be to download an editor
     like emacs, and the GNU Tools--the compiler and linker--which are
     polished and excellent to the same degree as emacs. Equipped with
     these, one would be able to start downloading ASCII source code
     files (/src) and compiling them into binary object code files (/bin)
     that would run on the machine. But in order to even arrive at this
     point--to get emacs running, for example--you have to have Linux
     actually up and running on your machine. And even a minimal Linux
     operating system requires thousands of binary files all acting in
     concert, and arranged and linked together just so.

     Several entities have therefore taken it upon themselves to create
     "distributions" of Linux. If I may extend the Egypt analogy
     slightly, these entities are a bit like tour guides who meet you at
     the airport, who speak your language, and who help guide you through
     the initial culture shock. If you are an Egyptian, of course, you
     see it the other way; tour guides exist to keep brutish outlanders
     from traipsing through your mosques and asking you the same
     questions over and over and over again.

     Some of these tour guides are commercial organizations, such as Red
     Hat Software, which makes a Linux distribution called Red Hat that
     has a relatively commercial sheen to it. In most cases you put a Red
     Hat CD-ROM into your PC and reboot and it handles the rest. Just as
     a tour guide in Egypt will expect some sort of compensation for his
     services, commercial distributions need to be paid for. In most
     cases they cost almost nothing and are well worth it.

     I use a distribution called Debian (the word is a contraction of
     "Deborah" and "Ian") which is non-commercial. It is organized (or
     perhaps I should say "it has organized itself") along the same lines
     as Linux in general, which is to say that it consists of volunteers
     who collaborate over the Net, each responsible for looking after a
     different chunk of the system. These people have broken Linux down
     into a number of packages, which are compressed files that can be
     downloaded to an already functioning Debian Linux system, then
     opened up and unpacked using a free installer application. Of
     course, as such, Debian has no commercial arm--no distribution
     mechanism. You can download all Debian packages over the Net, but
     most people will want to have them on a CD-ROM. Several different
     companies have taken it upon themselves to decoct all of the current
     Debian packages onto CD-ROMs and then sell them. I buy mine from
     Linux Systems Labs. The cost for a three-disc set, containing Debian
     in its entirety, is less than three dollars. But (and this is an
     important distinction) not a single penny of that three dollars is
     going to any of the coders who created Linux, nor to the Debian
     packagers. It goes to Linux Systems Labs and it pays, not for the
     software, or the packages, but for the cost of stamping out the
     CD-ROMs.

     Every Linux distribution embodies some more or less clever hack for
     circumventing the normal boot process and causing your computer,
     when it is turned on, to organize itself, not as a PC running
     Windows, but as a "host" running Unix. This is slightly alarming the
     first time you see it, but completely harmless. When a PC boots up,
     it goes through a little self-test routine, taking an inventory of
     available disks and memory, and then begins looking around for a
     disk to boot up from. In any normal Windows computer that disk will
     be a hard drive. But if you have your system configured right, it
     will look first for a floppy or CD-ROM disk, and boot from that if
     one is available.

     Linux exploits this chink in the defenses. Your computer notices a
     bootable disk in the floppy or CD-ROM drive, loads in some object
     code from that disk, and blindly begins to execute it. But this is
     not Microsoft or Apple code, this is Linux code, and so at this
     point your computer begins to behave very differently from what you
     are accustomed to. Cryptic messages begin to scroll up the screen.
     If you had booted a commercial OS, you would, at this point, be
     seeing a "Welcome to MacOS" cartoon, or a screen filled with clouds
     in a blue sky, and a Windows logo. But under Linux you get a long
     telegram printed in stark white letters on a black screen. There is
     no "welcome!" message. Most of the telegram has the semi-inscrutable
     menace of graffiti tags.

Dec 14 15:04:15 theRev syslogd 1.3-3#17: restart.
Dec 14 15:04:15 theRev kernel: klogd 1.3-3, log source = /proc/kmsg started.
Dec 14 15:04:15 theRev kernel: Loaded 3535 symbols from /System.map.
Dec 14 15:04:15 theRev kernel: Symbols match kernel version 2.0.30.
Dec 14 15:04:15 theRev kernel: No module symbols loaded.
Dec 14 15:04:15 theRev kernel: Intel MultiProcessor Specification v1.4
Dec 14 15:04:15 theRev kernel: Virtual Wire compatibility mode.
Dec 14 15:04:15 theRev kernel: OEM ID: INTEL Product ID: 440FX APIC at: 0xFEE00000
Dec 14 15:04:15 theRev kernel: Processor #0 Pentium(tm) Pro APIC version 17
Dec 14 15:04:15 theRev kernel: Processor #1 Pentium(tm) Pro APIC version 17
Dec 14 15:04:15 theRev kernel: I/O APIC #2 Version 17 at 0xFEC00000.
Dec 14 15:04:15 theRev kernel: Processors: 2
Dec 14 15:04:15 theRev kernel: Console: 16 point font, 400 scans
Dec 14 15:04:15 theRev kernel: Console: colour VGA+ 80x25, 1 virtual console (max 63)
Dec 14 15:04:15 theRev kernel: pcibios_init : BIOS32 Service Directory structure at 0x000fdb70
Dec 14 15:04:15 theRev kernel: pcibios_init : BIOS32 Service Directory entry at 0xfdb80
Dec 14 15:04:15 theRev kernel: pcibios_init : PCI BIOS revision 2.10 entry at 0xfdba1
Dec 14 15:04:15 theRev kernel: Probing PCI hardware.
Dec 14 15:04:15 theRev kernel: Warning : Unknown PCI device (10b7:9001). Please read include/linux/pci.h
Dec 14 15:04:15 theRev kernel: Calibrating delay loop.. ok - 179.40 BogoMIPS
Dec 14 15:04:15 theRev kernel: Memory: 64268k/66556k available (700k kernel code, 384k reserved, 1204k data)
Dec 14 15:04:15 theRev kernel: Swansea University Computer Society NET3.035 for Linux 2.0
Dec 14 15:04:15 theRev kernel: NET3: Unix domain sockets 0.13 for Linux NET3.035
Dec 14 15:04:15 theRev kernel: Swansea University Computer Society TCP/IP for NET3.034
Dec 14 15:04:15 theRev kernel: IP Protocols: ICMP, UDP, TCP
Dec 14 15:04:15 theRev kernel: Checking 386/387 coupling... Ok, fpu using exception 16 error reporting.
Dec 14 15:04:15 theRev kernel: Checking 'hlt' instruction... Ok.
Dec 14 15:04:15 theRev kernel: Linux version 2.0.30 (root@theRev) (gcc version 2.7.2.1) #15 Fri Mar 27 16:37:24 PST 1998
Dec 14 15:04:15 theRev kernel: Booting processor 1 stack 00002000: Calibrating delay loop.. ok - 179.40 BogoMIPS
Dec 14 15:04:15 theRev kernel: Total of 2 processors activated (358.81 BogoMIPS)