Disillusionment with History

I've noticed lately how most of my general skills lean heavily towards computing; albeit deliberate,
this is still mildly disconcerting.  My knowledge of mathematics, history, and my choice of readings
all heavily, or entirely, concern the field of automatic computing.  Given what I know about the
history of this field, I've no doubt whatsoever that general world history, resembling it, is
completely unreliable.

The field of automatic computing is younger than one century, which is a large part of why I pursued
it, and already it has many forgotten realms, idiotic fiefdoms claimed by retards, liars, and cults.
Most make no attempt to learn any history, and those who make some attempt are most often led astray
by a cult; forgotten realms with different possibilities are explained away; retards command armies.

From its beginning, automatic computing was created to save human labour and to provide correct
answers; this axiom has been forgotten, and people no longer truly understand for what reasons
computers exist.  A computer is not a series of digital levers, sparing users from flipping them by
hand, but the ability to have one lever's flip activate all of them, or none, or any other pattern
the machine can be taught; following this, the machine could be taught the meta-patterns of stimuli
relating to these patterns, and to activate them automatically, soon running autonomously, until it
encounters situations so new a human operator must tell it how to proceed.  The goal isn't to flip
levers, but to be able to forget them entirely.  Thus, when a man spends hours flipping digital
levers, it's an obscene act, against the spirit.
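
As a rough sketch of this spirit, and purely my own illustration in Python, consider an invented
machine of eight levers; the patterns, stimuli, and names below are all arbitrary:

    levers = [False] * 8

    # Named patterns of lever settings the machine has been taught.
    patterns = {
        "all": [True] * 8,
        "none": [False] * 8,
        "alternating": [index % 2 == 0 for index in range(8)],
    }

    # Meta-patterns: which stimulus activates which pattern, automatically.
    responses = {"power on": "all", "power off": "none", "test": "alternating"}

    def react(stimulus):
        """Return the taught pattern, deferring to the human operator when the
        situation is so new the machine must be told how to proceed."""
        if stimulus not in responses:
            responses[stimulus] = input("Unknown stimulus %r; which pattern? " % stimulus)
        return list(patterns[responses[stimulus]])

    levers = react("power on")  # one flip activates the entire pattern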

My chosen forgotten realms pursued this spirit of decreasing human labour.  The fiefdoms, liars, and
cults act against it.  It would be inappropriate to express this disgust with computing history, and
not mention UNIX, brimming with all three groups.  It's responsible for teaching countless people to
bend themselves to the machine, never daring to customize it in certain trivial ways, and then to
pride
themselves on this obscenity; the liars claim it was the first operating system written in a higher-
level language, they claim it had the first hierarchical file system, they claim an operating system
panicking is perfectly reasonable behaviour, they claim doing something once in the operating system
is worse than doing it in every program that uses it, they claim things must be this way or similar,
and they claim yet other vicious lies; and those fiefdoms are built on these foundations, justifying
complicated languages by making comparisons to the natural sciences despite there needing to be no
such
complications in a human construct, taking joy in writing incomprehensible programs, and mocking the
people with the good sense to look upon them with disgust, amongst other ways these fiefdoms attempt
to maintain their social control in spite of evidence.  Those who could stop them don't know any
better.

When I designed some of my programs, I could look upon their peculiarities, and look kindly upon the
little hacks I'd noticed were possible therein.  I also recognized those systems were ugly tangents,
and eliminated them.  The modern computer hasn't much undergone this necessary cleansing; meat hooks
hang from the ceiling, some holding rotten meat, many of them vestigial hooks, and a horrible stench
pervades the room.  The people living in these queer houses can't imagine architecture without
such hooks.

Transactions were created (I believe) to serve the SABRE airline reservation system.  That system
transformed a manual process, making it faster and more reliable.  I believe, were people faced
with such machine issues nowadays, the first instinct would be to cope with them, if a hack weren't
used instead.  A primary issue with automation is growing dependent on it, and then coping with
subpar automation of tasks.
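
To make concrete what a transaction provides, here's a minimal sketch in Python; the sqlite3 module
and the little seats table are merely my convenient stand-ins, bearing no relation to SABRE itself:

    import sqlite3

    connection = sqlite3.connect(":memory:")
    connection.execute("CREATE TABLE seats (flight TEXT, seat TEXT, passenger TEXT)")
    connection.execute("INSERT INTO seats VALUES ('VS1', '1A', NULL)")

    try:
        # The with block is a transaction: every change happens, or none does.
        with connection:
            connection.execute("UPDATE seats SET passenger = 'SMITH'"
                               " WHERE flight = 'VS1' AND seat = '1A'")
            connection.execute("INSERT INTO seats VALUES ('VS1', '1B', 'SMITH')")
    except sqlite3.Error:
        pass  # rolled back as a whole; no half-made reservation remains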

It's obvious that computer hardware is influenced by software, and so calculations done repetitively
are candidates for promotion to the lower level.  This was once done and, since machines which
return incorrect results are useless, part of the motivation was to create machines which could
return such results in fewer ways.  Historical momentum has made dumb hardware, and people claim
simple problems are intractable, yet relatively pointless calculations are still promoted to the
hardware level, for no reason other than making something uncommon slightly more efficient.  Modern
systems still exhibit failure cases that were eliminated decades prior, and the only progress
against such is a series of impotent ``mitigations''.  Hardware could and should perform ever more
of the very common operations; speed is not a concern, but a red herring, as it's never actually an
issue.  A single debugging session on a lesser machine overwhelms and eliminates any gains.  People
don't know what machine design serves.
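
This point can be made concrete with a toy in Python, wholly mine and resembling no real machine: a
sixteen-bit addition which silently wraps to a wrong answer, beside one which refuses to return
incorrect results:

    WIDTH = 16
    MODULUS = 1 << WIDTH

    def wrapping_add(a, b):
        """How dumb hardware behaves: the wrong answer, returned silently."""
        return (a + b) % MODULUS

    def trapping_add(a, b):
        """What hardware could do: refuse to return an incorrect result."""
        if a + b >= MODULUS:
            raise OverflowError("%d + %d exceeds %d bits" % (a, b, WIDTH))
        return a + b

    print(wrapping_add(65535, 1))  # prints 0, silently wrong
    print(trapping_add(65535, 1))  # raises, as a correct machine would trap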

In nearly every instance in which I observe an old computer architecture discussed, I see it
described as a RISC; it's irrelevant how many instructions are available, their forms, or what
operations they perform, as is how much memory was available to its machines; it will likely be
described as a RISC.  I've also seen the supposedly rare CISC described as RISC, because it's
``RISC internally''; this
dissonance of ideas that would lead to idiots giving everything a meaningless label makes sense when
these idiots are correctly identified as cultists.  There's a compulsion to colour everything with a
particular shade, so the dominion of the cult becomes the world.  This is a general pattern of note.

The purpose of a high-level language is to save human effort.  People rightly recognize that using
machine code is arduous and prone to error, but have been duped into believing there ought to be
limits to this abstracting away; rather than prefer ever smaller, ever clearer programs, people are
tricked into believing ``control'' and ``efficiency'' are admirable goals for mundane programs;
making those higher systems better is considered a waste of time, compared to improving the worse
languages; this malignant idiocy sits above machine code, and below decent languages, and has the
names C, C++, and a few others also common.  These are labels no different from RISC or CISC, with
fools using them to claim a language is truly C or C++, because an implementation thereof is
written therein.  That the same reasoning applies to the underlying machine code goes unnoticed, as
it's clearly silly to claim a higher language is just a machine code, and so the cult gleefully
persists in its idiotic behaviour, as it truly seeks power.

A language which removes the effort of providing information to its base infrastructure, preferably
avoiding the need to even deign to mention ever more tangents directly, is always better, but this
basic fact has been forgotten in favour of those inadequate languages which manage to keep good
reputations.

I'm forced to wonder if all wondrous technology goes through such a phase as computing currently is
in, in which humans create it, and idiots build a community around needlessly abusing it.  Did the
operators of early printing presses forget what that tool was for, or find it fine to print
illegibly, given it was good enough?  I know none of these incompetent programmers would enjoy it
were the operators of their water infrastructure behaving so carelessly, retorting that an advanced
user always boils his water.

My most recent review covered a paper from 1978 decrying how large languages and programs written in
them are baroque, unwieldy, weak, and unsuitable.  I recommended it, in part because I don't believe
most programmers have read it.  The situation has only grown worse in the following decades, without
evidence it will improve.  Many people have accepted certain programs should be very large and don't
attempt to fully understand them; smaller programs, even if arduous to fully understand, are better.

Recently, I've been more concerned with automating manual tasks I often perform.  Such automation
is unnecessarily difficult to achieve, and I become ever more aware of those increasingly basic
tasks I still perform manually, even if they take just a few keystrokes, albeit keystrokes which
span programs, and are thus more arduous.  Inverted program design, in which the program ``shell''
is cracked open, easily inspected and modified, increasingly seems the only reasonable way to write
an interactive program.
There's no reason a computer can't collect and remember its global and local event streams, which is
necessary for many kinds of trivial automation, but since this isn't the focus it simply isn't done.
Most every program should be customizable as it runs, and this should be opt-out, not opt-in as now.
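
Here's a sketch of the idea in Python, wholly my own invention: a system which collects and
remembers its event streams, from which trivial automation, such as replay, follows immediately:

    import time

    event_log = []  # (moment, program, event) triples, local and global alike

    def record(program, event):
        """Remember every event as it happens."""
        event_log.append((time.time(), program, event))

    def replay(program, act):
        """Trivial automation: feed a remembered stream back through a handler."""
        for moment, source, event in event_log:
            if source == program:
                act(event)

    record("editor", "open notes.txt")
    record("editor", "insert 'hello'")
    record("shell", "list directory")
    replay("editor", print)  # repeats the editor's actions, sans keystrokes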

These cases, and others I may describe in a later addition to this article, make it clear most
don't understand the purpose of computing, and those who attempt to learn its history are often led
astray by cults; fiefdoms built on this foundation abound and collect ignorant followers; liars
thrive in these conditions; forgotten realms provide good answers to solved problems which still
plague people.

The current machines give one a fake body through which to interact with the network, but this
falls so far short of their capabilities; enabling these fake bodies isn't the purpose of the
machines at all.

This dissonance of thought and horrifying, utter confusion with basic history is very disconcerting.
I don't believe the general history of the world, with its far greater motives to shape it, is
reliable.