Determinism can be Inefficient

I've noticed in recent years instances of once-deterministic systems being made unpredictable in the
ostensible pursuit of more-efficient resource usage.  I find the trend disastrous and in need of
reversal.

Traffic lights are a prosaic example.  I could once learn how a particular intersection behaved, and
optimize my speed to avoid stopping as I approached it.  With the introduction of cameras, truly for
surveillance, which attempt to detect traffic flow and to manipulate the lights in rhythm with it, I
can no longer perform this optimization.  I've even come across faulty intersections which prefer to
allow traffic to flow only along the most common paths and, having no need to behave predictably,
will simply maintain that bias until manually triggered by a driver along another path, requiring
him to determine how the particular mechanism there works, and then to manipulate it into firing.
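
As a minimal sketch of the optimization a deterministic light permits, consider the following
Python, in which the cycle length, green window, and legal speeds are all assumed figures rather
than measurements of any real intersection:

    # A fixed-cycle light is a pure function of time; knowing its schedule,
    # a driver can choose a lawful speed which arrives during the green.
    CYCLE = 60.0         # assumed: full cycle length, in seconds
    GREEN = (0.0, 25.0)  # assumed: green interval within the cycle

    def is_green(t):
        """Whether the light shows green at absolute time t."""
        return GREEN[0] <= t % CYCLE < GREEN[1]

    def speed_for_green(distance, now, v_min=8.0, v_max=14.0, step=0.1):
        """Search lawful speeds, in m/s, for one arriving on green, if any."""
        v = v_max
        while v >= v_min:
            if is_green(now + distance / v):
                return v
            v -= step
        return None  # no such speed exists; a stop is unavoidable

    print(speed_for_green(distance=300.0, now=50.0))

Once the light ceases to be a function of time alone, no such calculation is possible, by anyone.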

An example I've only seen others suffer is the Internet service using ``artificial intelligence'' to
control some basic mechanism, with the ostensible goal of optimization, such as of what to display.
However, the end effect is users expending resources to discover means of circumventing the opaque
mechanism.  The typical excuse for using unpredictable methods, rather than affording direct control
to users, is a total lack of trust, perhaps coupled with the belittling implication that the direct
control would be too difficult for them.

Horrible program interfaces, best exemplified by WWW browsers, are my third example; a deterministic
interface can't be raced against.  Rather than behave so, WWW browsers display incomplete pages as
they load, and the model of the WWW denies them the information needed to do this correctly, meaning
without displaying intermittent garbage; the race is then how quickly one can send commands against
the current display before it changes yet again.  The wrongly-issued commands, and the network
traffic they cause, surely outweigh any gains.
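
The race is easily sketched; in the following Python, the element names and timings are invented,
but the failure is the familiar one of a command landing on whatever has shifted beneath it:

    # Simulate incremental page loading: elements arrive over time, and each
    # later arrival inserts itself above, shifting what lies beneath a click.
    arrivals = [  # assumed timings and names, purely for illustration
        (0.0, "search box"),
        (0.4, "advertisement"),
        (0.9, "login button"),
    ]

    def layout(t):
        """The page as displayed at time t, top to bottom."""
        return [name for when, name in sorted(arrivals, reverse=True)
                if when <= t]

    def click(row, t):
        """Whatever occupies the given row when the click actually lands."""
        page = layout(t)
        return page[row] if row < len(page) else None

    # The user reads the page at t=0.2 and aims at the search box, but the
    # click lands at t=0.5, after the advertisement has pushed it downwards.
    target = layout(0.2).index("search box")
    print(click(target, 0.5))  # prints 'advertisement', not 'search box'

A page displayed only once, whole, would admit no such race, and waste no commands upon losers.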

These are thus three examples of how poor design meant to save resources likely achieves just the
opposite.  Non-determinism can easily be removed from a simple system, but not from a complex one.
Determinism can easily be removed from any system, but adding it can be much more difficult.
Foundations should again tend towards being mechanically predictable, lest they coalesce into an
incomprehensible mess.  I see many idiots advocate, unaware of the harm, for adding complicated and
unpredictable mechanisms to many things, without thought; naturally, idiots will only see the harm
if it kills them, if then.
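
That asymmetry can be sketched in a few lines of Python; the work performed here is invented, and
the point is merely how little it takes to destroy an ordering against how much to restore one:

    import threading
    import time

    # Removing determinism takes one thoughtless step: let two threads
    # interleave their work freely, with the order varying run to run.
    results = []
    def work(name):
        for i in range(3):
            results.append((name, i))
            time.sleep(0)  # yield, inviting the interleaving to vary

    threads = [threading.Thread(target=work, args=(n,)) for n in "AB"]
    for t in threads: t.start()
    for t in threads: t.join()
    print(results)  # the order may differ between runs

    # Restoring determinism demands explicit machinery; the crude fix is
    # serializing the threads entirely, which surrenders the very
    # concurrency that excused the unpredictability in the first place.
    results.clear()
    for n in "AB":
        t = threading.Thread(target=work, args=(n,))
        t.start(); t.join()
    print(results)  # always all of A, then all of B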

A world in which humans can't make accurate predictions of man-made machinations is a truly dark one.