Remarks on Amdahl's Law

Amdahl's is yet another ``law'' I see used regularly in discussions about computer programming.  The
``law'' is the brand name of a trivial observation; the problem lies entirely in its misapplication.
That trivial observation is as follows: Optimizing part of a program, particularly for time, affects
the overall program only to the extent that the part is used; if a tenth of a program's running time
is spent in some part, then optimizing that part leaves the remaining nine tenths rather unaffected.
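
In symbols: if a fraction p of the running time is spent in the part, and the part is made s times
faster, then the whole program runs 1 / ((1 - p) + p / s) times faster, a figure which approaches
1 / (1 - p) no matter how large s grows.  What follows is a minimal sketch in C of this arithmetic,
with the function name amdahl being purely my own illustration:

    #include <stdio.h>

    /* Overall speedup when a fraction p of the running time is made s times
     * faster: the remaining (1 - p) is untouched, while p shrinks to p / s. */
    static double amdahl(double p, double s)
    {
        return 1.0 / ((1.0 - p) + p / s);
    }

    int main(void)
    {
        /* A tenth of the running time, optimized ever more aggressively. */
        printf("%f\n", amdahl(0.1, 2.0));    /* 1.052632 */
        printf("%f\n", amdahl(0.1, 10.0));   /* 1.098901 */
        printf("%f\n", amdahl(0.1, 1000.0)); /* 1.110988, nearing 10/9 */
        return 0;
    }

Even an infinite optimization of that tenth leaves the program no better than ten ninths its former
speed; so far, so trivial.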

The misapplication of this law comes from what could perhaps be called ARGUMENTUM AD MUNDUM, that is
``argument to reality'' or ``argument to existence'', for it's misapplied as some fundamental reason
that programs can't be improved.  Such misapplication, which attempts to excuse the sorry state of
most software, is the entire reason such ``laws'' are so popular; instead of ignorance and stupidity
as the culprits, the sorry state of some software is supposedly a fundamental consequence of reality
for which no particularly meaningful improvements can be made.  Perhaps this could also be called by
the name ARGUMENTUM AD PRAESENTUM or ``argument to the present'', since mankind has so clearly never
been wrong for a measly few decades, and so whatever currently exists isn't very horribly misguided.

The observation's misapplication lies in conflation: one implementation is conflated, to some
degree, with all possible implementations, yet it's clear many problems can be optimized past the
limits given by the ``law'' by taking a different approach entirely.  The serial fraction measured
is a property of the program, not necessarily of the problem.  I'd wager most problems can be
solved in ways which scale with the number of processors provided, so long as they involve no
emulation of some serial machine.
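
As a sketch of that wager, consider summing a list of numbers.  A left-to-right loop carries a
dependence chain as long as the list, which a thoughtless application of the ``law'' declares to be
unimprovable; the chain belongs to the loop, however, not to the sum.  The pairwise form below,
with the name pairwise_sum being merely my illustration, shortens the critical path to roughly the
base-two logarithm of the length, since the halves share nothing and may proceed on separate
processors:

    #include <stdio.h>

    /* Sum by halving: the two recursive calls touch disjoint halves of the
     * array, so a machine with enough processors may run them at once; the
     * critical path is the recursion depth, near log2(n), rather than n. */
    static double pairwise_sum(const double *a, long n)
    {
        if (n == 1)
            return a[0];
        return pairwise_sum(a, n / 2) + pairwise_sum(a + n / 2, n - n / 2);
    }

    int main(void)
    {
        double a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        printf("%f\n", pairwise_sum(a, 8)); /* 36.000000 */
        return 0;
    }

No serial machine is emulated in this restatement, and so the supposedly fundamental serial
fraction simply vanishes with it.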

I'm forever wary of ``laws'' in this immature field with a misnomer as a name, ``computer science''.