Against the Modern Web
By Edward Willis (http://encw.xyz)
Published Dec/29/2022

The Web, as it exists today, is a bloated mess. Visiting a website today, say a
popular news site, involves downloading many megabytes of data, much of it
scripting.

For example, visiting cnn.com involves over 700 requests and more than 10MB of
data transferred. And that is only to start! Over the next couple of minutes,
with the cnn.com homepage just sitting there, it balloons to over 1100 requests
and 20MB.
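Numbers like these are easy to check for yourself. Below is a rough sketch
using the Playwright browser-automation library for Python (my choice of tool,
not a prescription; any browser's developer tools will show the same thing).
It approximates transfer size from Content-Length headers, so if anything it
undercounts.

    from playwright.sync_api import sync_playwright

    def measure(url, settle_seconds=120):
        requests = 0
        total_bytes = 0

        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page()

            # Count every response and add up its (approximate) size.
            def on_response(response):
                nonlocal requests, total_bytes
                requests += 1
                length = response.headers.get("content-length")
                if length and length.isdigit():
                    total_bytes += int(length)

            page.on("response", on_response)
            page.goto(url, wait_until="load")
            # Let the page sit, as described above, so late-arriving
            # scripts, ads, and trackers are counted too.
            page.wait_for_timeout(settle_seconds * 1000)
            browser.close()

        print(f"{url}: {requests} requests, "
              f"{total_bytes / 1e6:.1f} MB (approx.)")

    measure("https://www.cnn.com")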

How is that a reasonable way to convey information? For context, 20MB is enough
data to contain the entire text of more than two volumes of the Encyclopedia
Britannica. And yet all cnn.com manages to do with it is offer a menu, a
selection of article titles, and some advertisements.

With each passing year, the vast majority of large and popular websites become
more and more bloated. The earliest snapshot of cnn.com that you can view on the
Wayback Machine (https://web.archive.org), a popular archive of webpages, was
recorded on Aug/15/2000. Including the Wayback Machine's menu system atop the
embedded CNN page, the whole page is 323KB as transferred. The first (of eleven)
snapshots on Aug/15/2010 is 7.1MB transferred. I'd argue the older pages were
more functional and more informationally dense. Take a look for yourself.

So what's the problem? First of all, this increase weighs not just on your
internet connection but on your computer itself. Much of this extra data is
embedded software that runs on your computer. Then there is the bloat in the
browser software itself needed to view modern pages. You might have visited
cnn.com in the year 2000 on a computer with a 550MHz Pentium III and 128MB of
RAM, connected via a 56K modem. Try navigating to cnn.com today with that same
setup, and see how that goes for you!

This might not seem like a problem to you: if your computer or whatever device
you use to browse the modern web gets slow, you simply buy a new one. But what
about those who cannot afford a relatively new computer? They are effectively
barred from using large swaths of the internet and from accessing information
and services that those more fortunate take for granted. Across America, and
indeed much of the world, public libraries offer on-line computer access to
everyone, including those who have no other way to reach the internet.
Constantly increasing computing requirements put those libraries on the same
upgrade treadmill as the rest of us, which strains their budgets and wastes
taxpayer money. Then there are our schools, strapped for cash at the best of
times, having to upgrade their computers too. In fact, computers are used
extensively throughout government at all levels. That's a ton of tax money
wasted because Web developers can't contain themselves.

Then there is the matter of the tremendous e-waste created when perfectly good,
functional computers are made unusable by bloat.

The thing is, none of this bloat is necessary. People used the Web 20+ years
ago and did much of what they do today. The bloat is the result of bad
programming practices and endless feature creep in Web tech.

Programmers can and should do better. If they want to call themselves engineers
(they're not engineers), then they'd better develop a duty of care for the
effects of their work on society and the environment.