Unix Philosophy

   Unix philosophy is one of the most important and essential approaches to
   [1]programming (and by extension all [2]technology design); it advocates
   great [3]minimalism and is best known by the saying that a program should
   only do one thing and do it well. Unix philosophy is a collective
   [4]wisdom, a set of design recommendations that evolved during the
   development of one of the earliest (and most historically important)
   [5]operating systems called [6]Unix, hence the name. Having been defined
   by [7]hackers (the true, old style ones), the philosophy naturally advises
   providing a set of many highly effective tools that can be combined in
   various ways, i.e. to perform [8]hacking, rather than restricting the user
   to the fixed, intended functionality of huge do-it-all programs. Unix
   philosophy advocates [9]simplicity, clarity, modularity, reusability and
   composition of larger programs out of very small programs rather than
   designing huge monolithic programs as a whole. Unix philosophy, at least
   partially, lives on in many projects and Unix-like operating systems such
   as [10]GNU/[11]Linux (though GNU/Linux distros are distancing themselves
   from Unix more and more), has been wholly adopted by groups such as
   [12]suckless and [13]LRS (us), and is even being reiterated in projects
   such as [14]plan9.

   NOTE: see also [15]everything is a file, another famous design principle
   of Unix -- this one is seen as a Unix-specific design choice rather than
   part of the general Unix philosophy itself, but it helps paint the whole
   picture.

   As written in the [16]GNU coreutils introduction, a Swiss army knife
   (universal tool that does many things at once) can be useful, but it's not
   a good tool for experts at work; they note that a professional carpenter
   will rather use a set of relatively simple, highly specialized tools, each
   of which is extremely efficient at its job. Unix philosophy brings this
   observation over to the world of expert programmers.

   In 1978 [17]Douglas McIlroy wrote a short overview of the Unix system
   (UNIX Time-Sharing System) in which he gives the main points of the
   system's style; this can be seen as a summary of the Unix philosophy (the
   following is paraphrased):

    1. Each program should do one thing and do it well. Overcomplicating
       existing programs isn't good; for new functionality create a new
       program.
     2. Output of a program should be easy to interpret by another program.
        In Unix programs are chained by so called [18]pipes, in which one
        program sends its output as input to another, so a programmer should
        bear this in mind. [19]Interactive programs should be avoided if
        possible. Make your program a [20]filter if possible, as that is
        exactly what helps this case.
    3. Program so that you can test early, don't be afraid to throw away code
       and rewrite it from scratch.
     4. Write and use tools; even if they're [21]short-lived, they're better
        than manual work. Unix-like systems are known for their high
        [22]scriptability.
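
   As a sketch of points 2 and 4 above, here is a hypothetical throwaway
   "tool" built on the spot by chaining standard filters with pipes (the
   word list is made up purely for illustration):

```shell
# quick throwaway job: from a made-up word list, keep only the words
# containing "a" and sort them -- each program is a simple text filter,
# chained with the next one via a pipe
printf 'pear\nplum\napple\nfig\n' | grep 'a' | sort
# prints:
# apple
# pear
```

   Because every stage reads plain text on standard input and writes plain
   text on standard output, any stage can be swapped or extended without
   touching the others.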

   This has later been condensed into: do one thing well, write programs to
   work together, make programs communicate via text streams, a universal
   interface.

   Details about to what extent/extreme this minimalism ("doing only one
   thing") should be taken are of course a hot topic of countless debates
   and opinions. The original Unix hackers are often highly strict; a famous
   example is the "cat -v considered [23]harmful" presentation, which bashes
   a relatively simple feature added to the [24]cat program, a program that
   should only ever concatenate files. Some tolerate adding a few
   convenience functions to trivial programs, especially [25]nowadays.

   Simple example: likely the most common practical example that can be given
   is [26]piping small [27]command line utility programs; inside a Unix
   system there live a number of small programs that do only one thing but do
   it well, for example the [28]cat program that only concatenates and
   outputs the content of selected files, the [29]grep program that searches
   for patterns in text etc. On the command line we may use so called
   [30]pipes to chain some of these simple programs into more complex
   processing [31]pipelines by redirecting one program's output stream to
   another one's input. Let's say we for example want to automatically list
   all first and second level headings on a given webpage and write them out
   alphabetically sorted. We can do it with a command such as this one:

 wget -q -O - "http://www.tastyfish.cz/lrs/main.html" | grep -i -o "<h[12][^>]*>[^<]*<" | sed "s/[^>]*> *\([^ ][^<]*[^ ]\) *<.*/\1/g" | sort

   Which may output for example:

 Are You A Noob?
 Did You Know
 less_retarded_wiki
 Topics
 Wanna Help?
 Welcome To The Less Retarded Wiki
 What Is Less Retarded Software/Society/Wiki?

   In the command the pipes (|) chain multiple programs together so that the
   output of one becomes the input of the next. The first command, [32]wget,
   downloads the [33]HTML content of the webpage and passes it to the second
   command, [34]grep, which [35]filters the text and only prints lines with
   headings (using so called [36]regular expressions); this is passed to
   [37]sed, which removes the HTML code, and the result is passed to sort,
   which sorts the lines alphabetically -- as this is the last command, the
   result
   is then printed out, but we could also e.g. add > output.txt at the end to
   save the result into a text file instead. We also use [38]flags to modify
   the behavior of the programs, for example -i tells grep to work in
   case-insensitive mode, -q tells wget to be silent and not print things
   such as download progress. [39]This whole wiki is basically made on top of
   a few scripts like this (compare e.g. to [40]MediaWiki software), so you
   literally see the manifestation of these presented concepts as you're
   reading this. This kind of "workflow" is a fast, powerful and very
   flexible way of processing data for anyone who knows the Unix tools.
   Notice the relative simplicity of each command and how each one works as a
   [41]text [42]filter; text is a universal communication interface and
   behaving as a filter makes intercommunication easy and efficient,
   utilizing the principle of a [43]pipeline. A filter simply takes an input
   stream of data and outputs another stream of data; it ideally works on
   the go (without having to load the whole input in order to produce the
   output), which has numerous advantages, for example requiring only a small
   amount of memory (which may become significant when we are running many
   programs at once in the pipeline, imagine e.g. a server with 10000 users,
   each one running his own commands like this) and decreasing [44]latency
   (the next pipe stage may start processing the data before the previous
   stage finishes). When you're writing a program, such as for example a
   [45]compression tool, make it work like this.
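
   To illustrate the filter behavior just described, here is a hedged sketch
   using the standard gzip program as a stand-in for one's own compression
   tool: given no file arguments, gzip reads standard input and writes to
   standard output, so it can sit anywhere in a pipeline and process the
   data as a stream:

```shell
# gzip with no file arguments acts as a filter: stdin in, stdout out,
# compressing the stream as it goes instead of loading it all into memory;
# piping straight into gunzip round-trips the data unchanged
printf 'hello hello hello\n' | gzip | gunzip
# prints: hello hello hello
```

   A compression tool written this way needs no file dialogs and no
   graphical display, and 10000 instances of it can run at once, each one
   holding only a small buffer in memory.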

   Compare this to the opposing [46]Windows philosophy, in which combining
   programs into collaborating units is not intended, is possibly even
   purposefully prevented, and is therefore very difficult, slow and
   impractical to do -- such programs are designed for manually performing
   some predefined actions, mostly using a [47]GUI, e.g. painting pictures
   with a mouse, but aren't designed to collaborate with each other or be
   automated; they can rarely be used in the unintended, inventive ways
   needed for powerful [48]hacking. Returning to the example of a
   compression tool, on Windows such a program would be a large GUI program
   that requires the user to open a file dialog and manually select a file
   to compress; it might then even do nasty things like load the whole file
   into memory (because anyone who can afford Windows can also afford a lot
   of [49]RAM), perform the compression there, and then write the data back
   to some other file. Need to use the program on a computer without a
   graphical display? Automate it to work with other programs? Run it from a
   script? Run 10000 instances of it at the same time alongside 10000 other
   similar programs? Bad luck, Windows philosophy doesn't allow this.

   Watch out! Do not misunderstand Unix philosophy. There are many extremely
   dangerous cases of misunderstanding Unix philosophy by [50]modern
   [51]wannabe programmers who can't tell [52]pseudominimalism apart from
   true [53]minimalism. One example is the hilarious myth about "[54]React
   following Unix philosophy" ([55]LMAO this); the devs just show so many
   misunderstandings here. Firstly, of course, [56]JavaScript itself is
   extremely [57]bloated, as it's a language aiming for things like comfort,
   rapid development, "safety" and beginner friendliness, to which it
   sacrifices performance and elegance; an expert hacker trying to write a
   highly thought through, optimized program is not its target group, so
   nothing based on JavaScript can ever be compatible with the Unix way in
   the first place. Secondly, they seem to imply that basically any system
   of modules follows Unix philosophy -- that's of course wrong: modularity
   far predates Unix philosophy, and Unix philosophy is more than that;
   merely having a package system of libraries, each of which focuses on
   some thing (even a very broad one like a highly complex [58]GUI), doesn't
   mean those tools are simple (both internally and externally), efficient,
   communicating in good ways and so on.

   Does Unix philosophy imply [59]universality is always bad? Well, most
   likely no, not in general at least -- it simply tells us that for an
   expert to create art that reaches the peak of his potential it seems best
   in most cases if he lives in an environment with many small, highly
   efficient tools that he can tinker with, which allow him to combine them,
   even (and especially) in unforeseen ways -- to do [60]hacking. Universal
   tools, however, are great as well, either as a supplement or for other use
   cases (non-experts, quick dirty jobs and so on) -- after all a general
   purpose [61]programming language such as [62]C, another creation of Unix
   creators themselves, is a universal tool that prefers generality over
   effectiveness at one specific task (for example you can use C to process
   text but you likely won't match the efficiency of [63]sed, etc.).
   Nevertheless let us realize an important thing: a universal tool can
   still be implemented in a minimalist way, therefore never confuse a
   universal tool with a bloated monolith encumbered by feature creep!
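
   The efficiency comparison above can be felt directly: a task that would
   take a whole program in a general purpose language is a one-liner in the
   specialized tool (a small illustration with made-up sample text):

```shell
# the specialized tool sed handles a text substitution in one line; the
# same job written in C would be a whole program of its own
printf 'unix philosophy\n' | sed 's/unix/Unix/'
# prints: Unix philosophy
```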

   { One possible practical interpretation of Unix philosophy I came up with
   is this: there's an upper but also lower limit on complexity. "Do one
   thing" means the program shouldn't be too complex, we can simplify this to
   e.g. "Your program shouldn't surpass 10 KLOC". "Do it well" means the
   programs shouldn't bee too trivial because then it is hardly doing it
   well, we could e.g. say "Your program shouldn't be shorter than 10 LOC".
   E.g. we shouldn't literally make a separate program for printing each
   ASCII symbol, such programs would be too simple and not doing a thing
   well. We rather make a [64]cat program, that's neither too complex nor too
   trivial, which can really print any ASCII symbol. By this point of view
   Unix philosophy is really about balance of triviality and huge complexity,
   but hints that the right balance tends to be much closer to the triviality
   than we humans are tempted to intuitively choose. Without guidance we tend
   to make programs too complex and so the philosophy exists to remind us to
   force ourselves to rather minimize our programs to strike the correct
   balance. ~drummyfish }

See Also

     * [65]LRS
     * [66]Unix
     * [67]minimalism
     * [68]suckless
     * [69]KISS
     * [70]Windows philosophy
     * [71]hacking

Links:
1. programming.md
2. tech.md
3. minimalism.md
4. wisdom.md
5. os.md
6. unix.md
7. hacking.md
8. hacking.md
9. kiss.md
10. gnu.md
11. linux.md
12. suckless.md
13. lrs.md
14. plan9.md
15. everything_is_a_file.md
16. gnu.md
17. mcilroy.md
18. pipe.md
19. interactive.md
20. filter.md
21. throwaway_script.md
22. script.md
23. harmful.md
24. cat.md
25. modern.md
26. pipe.md
27. cli.md
28. cat.md
29. grep.md
30. pipe.md
31. pipeline.md
32. wget.md
33. html.md
34. grep.md
35. filter.md
36. regex.md
37. sed.md
38. flag.md
39. lrs_wiki.md
40. mediawiki.md
41. text.md
42. filter.md
43. pipeline.md
44. latency.md
45. compression.md
46. windows_philosophy.md
47. gui.md
48. hacking.md
49. ram.md
50. modern.md
51. soydev.md
52. pseudominimalism.md
53. minimalism.md
54. react.md
55. http://img.stanleylieber.com/src/20872/img/small.1527773532.png
56. js.md
57. bloat.md
58. gui.md
59. universality.md
60. hacking.md
61. programming_language.md
62. c.md
63. sed.md
64. cat.md
65. lrs.md
66. unix.md
67. minimalism.md
68. suckless.md
69. kiss.md
70. windows_philosophy.md
71. hacking.md