[HN Gopher] Cold-blooded software
___________________________________________________________________
 
Cold-blooded software
 
Author : arbesman
Score  : 262 points
Date   : 2023-12-28 13:22 UTC (9 hours ago)
 
web link (dubroy.com)
w3m dump (dubroy.com)
 
| armchairhacker wrote:
| Counterpoint: some types of software aren't meant to last long.
| Even if it still builds and can be worked on later, the use case
| itself may have changed or disappeared, or someone has probably
| come up with a new, better version, so it's no longer worth
| continuing.
| 
| This probably doesn't apply to many types of software over 6
| months, but it does over a couple of years or a couple of
| decades. Some online services like CI or package managers will
| almost certainly provide backwards-compatible service until then.
| 
| Another possibility is that developer efficiency improves so much
| that the code written 10 years ago is easier to completely
| rewrite today, than it is to maintain and extend.
| 
| This is why I'm hesitant to think about software lasting decades,
| because tech changes so fast it's hard to know what the next
| decade will look like. My hope is that in a few years, LLMs
| and/or better developer tools will make code more flexible, so
| that it's very easy to upgrade legacy code and fix imperfect
| code.
 
  | iamthepieman wrote:
  | "Another possibility is that developer efficiency improves so
  | much that the code written 10 years ago is easier to completely
  | rewrite today, than it is to maintain and extend."
  | 
  | This seems completely false to me, and I'm curious what has
  | caused you to believe it, as I'm a fairly imaginative and
  | creative person yet I cannot imagine a set of circumstances
  | that would lead someone to this conclusion.
  | 
  | In other words, I disagree so very strongly with that statement
  | that I wanted to engage rather than just downvote. (I didn't
  | btw).
  | 
  | I agree with your first statement though and I don't think the
  | op is saying only make cold-blooded projects.
 
    | dartos wrote:
    | Well. I don't know if I agree or not, but I felt like playing
    | devil's advocate.
    | 
    | Take the example of game development. Trying to maintain,
    | say, The Hobbit game from the early 2000s to today would
    | almost certainly take more work than just making a new one
    | from scratch today (GPUs have changed drastically over the
    | past 20 years, and making simple 3D platformers with Unreal
    | is so easy that "asset flips" are a new kind of scam).
    | 
    | Or a tool which lets people visually communicate over vast
    | distances without specialized hardware.
    | 
    | That was a huge lift in the 2000s when Skype was the only
    | major player, but you can find tutorials for it now using
    | webrtc.
 
| languagehacker wrote:
| Cold-blooded software seems like a great idea in spaces where the
| security risk and business impact are low. I can think of a lot
| of great hobbyist uses for this approach, like a handmade
| appliance with Arduino or Raspberry Pi.
| 
| The ever-evolving threat landscape at both the OS and application
| level makes this unviable for projects with any amount of money
| or sensitivity behind them. Imagine needing to handle an OS-level
| update and learning that you can no longer run Python 2 on the
| box you're running that project on. Fine for a blog, but
| calamitous for anything that handles financial transactions.
 
  | kardianos wrote:
  | If you are a bank, a store, or handle PHI, you will have
  | contractual obligations to maintain it. However, I still think
  | that can be "cold-blooded" maintenance. When I update a Go
  | project after running `govulncheck ./...`, it is generally
  | easy. I vendor; builds and runtime only rely on systems I
  | control.
 
    | apantel wrote:
    | Many large companies and business like banks and
    | manufacturers run legacy code in ancient runtimes. The
    | projects can be so frozen in time that nobody has the courage
    | to touch them.
 
  | joshuaissac wrote:
  | > Cold-blooded software seems like a great idea in spaces where
  | the security risk and business impact are low. I can think of a
  | lot of great hobbyist uses for this approach, like a handmade
  | appliance with Arduino or Raspberry Pi.
  | 
  | I think it would be the other way around. A low-impact hobby
  | project can use exciting, fast-moving technology because if it
  | breaks, there is not so much damage (move fast and break
  | things). But something with high business impact should use
  | boring, tried-and-tested technologies with no external network
  | dependencies (e.g. a package being available in a third-party
  | repository at compile time or runtime). For something like
  | that, the OS updates (on the LTS branch if Linux) would be
  | planned well ahead, and there would be no surprises like the
  | Python 2 interpreter suddenly breaking.
 
  | chuckadams wrote:
  | Meh, just keep a container around with py2 in it, maybe just
  | containerize the whole app. The ultimate in vendored
  | dependencies, short of a whole VM image.
 
| 082349872349872 wrote:
| I suspect our differences in preferences for cold- vs warm-
| blooded projects may be related to the "Buxton Index" as
| mentioned in
| https://www.cs.utexas.edu/users/EWD/transcriptions/EWD11xx/E...
 
  | chrisweekly wrote:
  | Curious, I read the linked transcript to find:
  | 
  | "My third remark introduces you to the Buxton Index, so named
  | after its inventor, Professor John Buxton, at the time at
  | Warwick University. The Buxton Index of an entity, i.e. person
  | or organization, is defined as the length of the period,
  | measured in years, over which the entity makes its plans. For
  | the little grocery shop around the corner it is about 1/2, for
  | the true Christian it is infinity, and for most other entities
  | it is in between: about 4 for the average politician who aims
  | at his re-election, slightly more for most industries, but much
  | less for the managers who have to write quarterly reports. The
  | Buxton Index is an important concept because close co-operation
  | between entities with very different Buxton Indices invariably
  | fails and leads to moral complaints about the partner. The
  | party with the smaller Buxton Index is accused of being
  | superficial and short-sighted, while the party with the larger
  | Buxton Index is accused of neglect of duty, of backing out of
  | its responsibility, of freewheeling, etc.. In addition, each
  | party accuses the other one of being stupid. The great
  | advantage of the Buxton Index is that, as a simple numerical
  | notion, it is morally neutral and lifts the difference above
  | the plane of moral concerns. The Buxton Index is important to
  | bear in mind when considering academic/industrial co-
  | operation."
 
    | salawat wrote:
    | Holy shit. I am so using this as a communication clarifying
    | tool. Nice concept.
 
    | apantel wrote:
    | Great concept, but isn't it just "time horizon"? Everyone
    | knows "time horizon".
 
      | Jtsummers wrote:
      | Not everyone knows it. Strangely, many of the (senior or
      | junior) project management types I work with have to be
      | introduced to the term and concept (and if they listen it
      | can at least resolve confusion, if not conflict, about the
      | different priorities and behaviors of all the parties
      | involved). But yes, they describe the same thing.
 
| jollyllama wrote:
| This is why I am trying to switch as many of the projects I'm on
| as possible to HTMX. The churn involved with all of the frontend
| frameworks means that there's far too much update work needed
| after letting a project sit for N quarters.
 
  | mikewarot wrote:
  | I googled HTMX, all excited that maybe, just maybe, the browser
  | people got their shit together and came up with a framework we
  | can all live with, something native to the browser with a few
  | new tags, and no other batteries required....
  | 
  | and was disappointed to find it's just a pile of other
  | libraries 8(
 
    | dartos wrote:
    | Everything is a pile of libraries.
    | 
    | It's a pile of someone else's code all the way down.
 
      | diggan wrote:
      | You can also use the web platform straight up without
      | transpilation, build tools, post-css compilation and all
      | that jazz.
      | 
      | Just vanilla JavaScript, CSS, HTML, some sprinkles of
      | WebComponents. And you can be pretty sure that you won't
      | have to update that for a decade or more, as compatibility
      | won't be broken in browsers.
      | 
      | Heck, I have vanilla JS projects I wrote 15 years ago that
      | still render and work exactly like how they rendered/worked
      | when I wrote them.
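      | 
      | For instance, a minimal custom element sketch using nothing
      | but platform APIs (the tag name is made up, just to
      | illustrate the idea):
      | 
      |     // Define <hello-badge>, a tiny web component with no
      |     // build step and no external dependencies.
      |     class HelloBadge extends HTMLElement {
      |       connectedCallback() {
      |         // Render from the "name" attribute when attached.
      |         const name = this.getAttribute('name') || 'world';
      |         this.textContent = `Hello, ${name}!`;
      |       }
      |     }
      |     customElements.define('hello-badge', HelloBadge);
      | 
      |     // Usage in HTML: <hello-badge name="HN"></hello-badge>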
 
        | jollyllama wrote:
        | Indeed, that baggage is all that I avoid by using HTMX.
 
        | diggan wrote:
        | You do you. It's worth knowing though that using HTMX is
        | not vanilla JS/HTML/CSS, it's literally the opposite of
        | that.
 
        | kugelblitz wrote:
        | It's one small dependency. Worst case, you write the
        | library yourself.
        | 
        | You send a request to the backend, it then sends you HTML
        | back (all rendered in the backend using a templating
        | language such as Django templating engine, Twig or
        | Liquid), you insert it into a div or so.
        | 
        | Htmx used to be Intercooler; worst case you create your
        | own. But no additional scripts are needed.
        | 
        | I've been able to kick Vue out because Htmx covers my use
        | case.
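        | 
        | A rough sketch of that pattern in plain JavaScript (not
        | HTMX itself; the /fragment endpoint and the target id are
        | made up for illustration):
        | 
        |     // Ask the backend for server-rendered HTML and swap
        |     // it into a target element, HTMX-style.
        |     async function swapFragment(url, targetId) {
        |       const res = await fetch(url);
        |       const html = await res.text();
        |       document.getElementById(targetId).innerHTML = html;
        |     }
        | 
        |     // e.g. load a fragment rendered by Django/Twig/Liquid
        |     swapFragment('/fragment', 'content');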
 
    | replwoacause wrote:
    | Nothing to be disappointed in here AFAICT. However, it's
    | shocking that you had to Google HTMX, seeing as it shows up
    | on HN a few times a month at least.
 
      | diggan wrote:
        | I'm guessing the disappointed feeling comes from the
        | parent saying "Pff, I'm so tired of all these libraries
        | that eventually update their APIs in a breaking way, so
        | now I'm using X" while X is just another library exactly
        | like all the rest, and will surely introduce a breaking
        | change or two down the line.
 
        | jollyllama wrote:
        | You're arguing from the abstract point of view, rather
        | than the practical. The point is that it takes an order
        | of magnitude more time to clone, say, a Vue project from
        | three years ago that nobody has touched since then and
        | try to download your dependencies and build on a new
        | machine, as compared to an HTMX project.
 
| ryanar wrote:
| I really appreciate this idea after rewriting my blog engine
| three times because the frameworks I was using (Next, Remix) had
| fundamental changes after a year and I was multiple major
| versions behind. Though it depends on what you are after. If the
| goal is to be able to blog, time spent upgrading and rewriting
| code because the framework is evolving is wasted time unless you
| want to stay up to date with that framework. Think about how we
| view physical goods today: they aren't built to last. In certain
| situations, like a personal blog, you want reliable software that
| works for years without the need to change. It also helps to have
| software that uses common data formats that are exportable to
| another system, like a blog based in markdown files, rather than
| JSX.
 
| layer8 wrote:
| I'm glad this isn't about ruthless software.
 
| vrnvu wrote:
| One thing I've noticed is that many engineers, when they're
| looking for a library on GitHub, check the last commit time.
| They think that the more recent the last commit is, the better
| supported the library is.
| 
| But what about an archived project that does exactly what you
| need it to do, has 0 bugs, and has been stable for years? That's
| like finding a hidden gem in a thrift store!
| 
| Most engineers I see nowadays will automatically discard a
| library that is not "constantly" updated... implying that
| constant change is a good thing :)
 
  | pmichaud wrote:
  | Even though it's not strictly true, checking for recent updates
  | is an excellent heuristic. I don't know the real numbers, but I
  | feel confident that in the overwhelming majority of cases, no
  | recent activity means "abandoned", not "complete and bug free".
 
  | scruple wrote:
  | I'm generally doing that to check for version compatibility
  | across a much broader spectrum than the level of a single
  | library.
 
  | fabian2k wrote:
  | A library can only stay static if the environment it's used in
  | is also static. And many of the environments in which modern
  | software is developed are anything but static, web frontends
  | are one example where things change quite often.
  | 
  | A library that can stand entirely on its own might be fine if
  | it's never updated. But e.g. a library that depends on a web
  | frontend framework will cause trouble if it is not updated to
  | adapt to changes in the ecosystem.
 
    | adonovan wrote:
    | Also, even a very stable project that is "done" will receive
    | a trickle of minor tweak PRs (often docs, tests, and
    | cleanups) proportional to the number of its users, so the
    | rate of change never falls to zero until the code stops being
    | useful.
 
      | diggan wrote:
      | > so the rate of change never falls to zero until the code
      | stops being useful
      | 
      | Non-useful software changes all the time ;) Also, useful
      | software stands still all the time, without any proposed
      | changes.
 
      | derefr wrote:
      | I think this is also in inverse proportion to the arcane-
      | ness of the intended use of the code, though.
      | 
      | Your average MVC web framework gets tons of these minor
      | contributors, because it's easy to understand MVC well
      | enough to write docs or tests for it, or to clean up the
      | code in a way that doesn't break it.
      | 
      | Your average piece of system software gets some. The Linux
      | kernel gets a few.
      | 
      | But ain't nobody's submitting docs/tests/cleanups for an
      | encryption or hashing algorithm implementation. (In fact,
      | AFAICT, these are often implemented _exactly once_, as a
      | reference implementation that does things in the same weird
      | way -- using procedural abstract assembler-like code, or
      | transpiled functional code, or whatever -- that the journal
      | paper describing the algorithm did; and then not a hair of
      | that code is ever touched again. Not to introduce comments;
      | not to make the code more testable; _definitely_ not to
      | refactor things. Nobody ever reads the paper except the
      | original implementor, so nobody ever truly understands what
      | parts of the code are critical to its functioning  /
      | hardening against various attacks, so nobody can make real
      | improvements. So it just sits there.)
 
      | josephg wrote:
      | I disagree. Tiny libraries can be fine indefinitely. For
      | example this little library which inverts a promise in
      | JavaScript.
      | 
      | I haven't touched this in years and it still works fine. I
      | could come in and update the version of the dependencies
      | but I don't need to, and that's a good thing.
      | 
      | https://github.com/josephg/resolvable
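      | 
      | The idea, roughly (a sketch of the pattern, not the actual
      | package code):
      | 
      |     // A promise with its resolve/reject exposed, so other
      |     // code can settle it from the outside.
      |     function resolvable() {
      |       let resolve, reject;
      |       const promise = new Promise((res, rej) => {
      |         resolve = res;
      |         reject = rej;
      |       });
      |       return Object.assign(promise, { resolve, reject });
      |     }
      | 
      |     // Usage: const p = resolvable(); ... p.resolve(42);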
 
        | xmprt wrote:
        | I think total number of commits is probably a good metric
        | too. If the project only has 7 commits to begin with then
        | it's unlikely to get any more updates after it's "done".
        | But a 10 year old project with 1000 commits where the
        | last commit was 3 years ago is a little more worrying.
 
    | zer00eyz wrote:
    | >> web frontends are one example where things change quite
    | often.
    | 
    | There is a world of difference between linux adding USB
    | support and how web front ends have evolved. One of them
    | feels like they are chasing the latest shiny object...
 
    | xmprt wrote:
    | As someone who migrated a somewhat old project to one which
    | uses a newer framework, I agree with this. The amount of time
    | I spent trying to figure out why an old module was broken
    | before realizing that one of its dependencies was using ESM
    | even though it was still using CJS... I don't even want to
    | think about it. Better to just make sure that a module was
    | written or updated within the last 3 years because that will
    | almost certainly work.
 
    | tedunangst wrote:
    | This is a very strange example. Browsers have fantastic
    | backwards compatibility. You can use the same libraries and
    | framework you used ten years ago to make a site and, with
    | very few exceptions, it will work perfectly fine in a modern
    | browser.
 
      | kazinator wrote:
      | The problem arises when you're not using old libraries and
      | frameworks. You're using new stuff, and come across an old,
      | unmaintained library you'd like to use.
      | 
      | Hey, it uses the same frameworks you're using --- except,
      | oh, ten years ago.
      | 
      | Before you can use it, you have to get it working with the
      | versions of those frameworks you're using today.
      | 
      | Someone did that already before you. They sent their patch
      | to the dead project, but didn't get a reply, so nobody
      | knows about it.
 
      | hiatus wrote:
      | You absolutely can do that, but it is likely the final
      | output will have numerous exploitable vulnerabilities.
 
      | crabmusket wrote:
      | Browsers themselves aren't usually the problem. While
      | sometimes they make changes, like what APIs are available
      | without HTTPS, I think you're right about their solid
      | backwards compatibility.
      | 
      | What people really mean when they talk about the frontend
      | is the build system that gets your (modern, TypeScript)
      | source code into (potentially Safari) browsers.
      | 
      | Chrome is highly backwards compatible. Webpack, not so
      | much. This build system churn goes hand-in-hand with
      | framework churn (e.g. Vue 2 to 3, while the team have put
      | heaps of effort into backwards compatibility, is not
      | automatic), and more recently, the rise of TypeScript, and
      | the way the CJS to ESM transition has been handled by tools
      | (especially Node).
 
    | LeifCarrotson wrote:
    | Even if the environment it's used in is static, the world it
    | lives in is not.
    | 
    | I work in industrial automation, which is a slow-moving
    | behemoth full of $20M equipment that gets commissioned once
    | and then runs for decades. There's a lot of it still
    | controlled with Windows 98 PCs and VB6 messes and PXI cards
    | from the 90s, even more that uses SLC500 PLCs.
    | 
    | But when retrofitting these machines or building new ones,
    | I'll still consider the newness of a tool or library. Modern
    | technology is often lots more performant, and manufacturers
    | typically support products for date-on-market plus 10 years.
    | 
    | There's definitely something to be said for sticking with
    | known good products, but even in static environments you may
    | want something new-ish.
 
  | hiAndrewQuinn wrote:
  | The Haskell community has a lot of these kinds of libraries. It
  | comes with the territory to some extent.
 
    | samus wrote:
    | The GHC project churns out changes at quite a high rate
    | though. The changes are quite small by themselves, but they
    | add up and an abandoned Haskell project is unlikely to be
    | compilable years later.
 
  | diggan wrote:
  | I remember seeing a bunch of graphs which showed how
  | programming languages have changed over time, and how much of
  | the original code is still there.
  | 
  | It showed that some languages were basically nothing like their
  | 1.0 versions, while others had retained most of the code
  | written and only added stuff on top.
  | 
  | In the end, it seems to also be reflected in the community and
  | ecosystem. I remember Clojure being close/at the top of the
  | list as the language hardly does breaking changes anymore, so
  | libraries that last changed 5 years ago, still run perfectly
  | well in the current version of the language.
  | 
  | I guess it helps that it's lisp-like as you can extend the core
  | of the language without changing it upstream, which of course
  | also comes with its own warts.
  | 
  | But one great change it made in me is that I stopped thinking
  | that "freshness" equals "greatness". It's probably more common
  | for me to use libraries today that basically stopped changing
  | years ago than libraries that were created in the last year.
  | And without major issues.
 
  | Uehreka wrote:
  | By zero bugs do you mean zero GitHub issues? Because zero
  | GitHub issues could mean that there are security
  | vulnerabilities but no one is reporting them because the
  | project is marked as abandoned.
 
    | diggan wrote:
    | > By zero bugs do you mean zero GitHub issues?
    | 
    | Or, the library just has zero bugs. It's possible, although
    | probably pretty uncommon :)
 
  | bratbag wrote:
  | Most engineers have probably been bitten in the ass by
  | versioned dependencies conflicting with each other.
 
    | wccrawford wrote:
    | And the other way, too, with the underlying language's
    | changes making the library stop working.
    | 
    | It's just really unlikely that a project stays working
    | without somewhat-frequent updates.
 
  | troupe wrote:
  | If you are asking yourself, "will this do what it says it will
  | do?" and you are comparing a project that hasn't had any
  | updates in the last 3 years vs one that has seen a constant
  | stream of updates over the last 3 years, which one do you think
  | has a greater probability of doing what it needs to do?
  | 
  | Now I do get your point. There is probably a better metric to
  | use. Like for example, how many people are adding this library
  | to their project and not removing it. But if you don't have
  | that, the number of recent updates to a project that has been
  | around for a long time is probably going to steer you in the
  | right direction more often than not.
 
  | pizzafeelsright wrote:
  | Good point. I have also seen Great Endeavor sit at 0.7.1
  | because the author gave up or graduated or got hired, and the
  | repo remains incomplete, lacking love and any explanation for
  | its abandonment.
 
  | duped wrote:
  | > But what about an archived project that does exactly what you
  | need it to do, has 0 bugs, and has been stable for years?
  | That's like finding a hidden gem in a thrift store!
  | 
  | Either the library is so trivial to implement myself that I
  | just do that anyway, which doesn't have issues w.r.t
  | maintenance or licensing, or it's unmaintained and there are
  | bugs that won't be fixed because it's unmaintained and now I
  | need to fork and fix it, taking on a legal burden with
  | licensing in addition to maintenance.
  | 
  | Bugs happen all the time for mundane reasons. A transitive
  | dependency updated and now an API has a breaking change but the
  | upstream has security fixes. Compilers updated and now a weird
  | combination of preprocessor flags causes a build failure. And
  | so on.
  | 
  | The idea that a piece of software that works today will work
  | tomorrow is a myth for anything non-trivial, which is why
  | checking the history is a useful smell test.
 
    | QuadmasterXLII wrote:
    | I submit math.js and numeric.js. math.js has an incredibly
    | active community and all sorts of commits; numeric.js is one
    | file of JavaScript and hasn't had an update in eight years.
    | If you want to multiply two 30-by-30 matrices, numeric.js
    | works just fine in 2023 and is literally 20 times faster.
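    | 
    | Something along these lines (assuming the single-file build
    | exposes a global `numeric`, with made-up matrices just to
    | illustrate):
    | 
    |     // Build two 30x30 matrices and multiply them with the
    |     // eight-year-old, never-updated numeric.js.
    |     const n = 30;
    |     const A = [], B = [];
    |     for (let i = 0; i < n; i++) {
    |       A.push(Array.from({ length: n }, () => Math.random()));
    |       B.push(Array.from({ length: n }, () => Math.random()));
    |     }
    |     const C = numeric.dot(A, B);  // plain matrix product
    |     console.log(C[0][0]);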
 
    | derefr wrote:
    | Consider an at-the-time novel hashing algorithm, e.g. Keccak.
    | 
    | * It's decidedly non-trivial -- you'd have to 1. be a
    | mathematician/cryptographer, and then 2. read the paper
    | describing the algorithm and _really_ understand it, before
    | you could implement it.
    | 
    | * But also, it's usually just one file with a few hundred
    | lines of C that just manipulates stack variables to turn a
    | block of memory into another block of memory. Nothing that
    | changes with new versions of the language. Nothing that rots.
    | Uses so few language features it would have compiled the same
    | 40 years ago.
    | 
    | Someone writes such code once; nobody ever modifies it again.
    | No bugs, unless they're bugs in the algorithm described by
    | the paper. Almost all libraries in HLLs are FFI wrappers for
    | the same one core low-level reference implementation.
 
      | duped wrote:
      | In practice, this code will use a variety of target-
      | specific optimizations or compiler intrinsics blocked
      | behind #ifdefs that need to be periodically updated or
      | added for new targets and toolchains. If it refers to any
      | kind of OS-specific APIs (like RNG) then it will also need
      | to be updated from time to time as those APIs change.
      | 
      | That's not to say that code can't change slowly, just the
      | idea that it _never_ changes is extremely rare in practice.
 
      | tedunangst wrote:
      | Keccak is perhaps not the best example to pick.
      | https://mouha.be/sha-3-buffer-overflow/
 
  | derefr wrote:
  | Depends on the language.
  | 
  | Some languages have releases every year or two where they will
  | introduce some new, elegant syntax (or maybe a new stdlib ADT,
  | etc) to replace some pattern that was frequent yet clumsy in
  | code written in that language. The developer communities for
  | these languages then usually pretty-much-instantly consider use
  | of the new syntax to be "idiomatic", and any code that still
  | does things the old, clumsy way to need fixing.
  | 
  | The argument for making the change to any particular codebase
  | is often that, relative to the new syntax, the old approach
  | makes things more opaque and harder to maintain / code-review.
  | If the new syntax existed from the start, nobody would think
  | the old approach was good code. So, _for the sake of legibility
  | to new developers, and to lower the barrier to entry to code
  | contributions_, the code should be updated to use the new
  | syntax.
  | 
  | If a library is implemented in such a language, and yet it
  | hasn't been updated in 3+ years, that's often a bad sign -- a
  | sign that the developer isn't "plugged into" the language's
  | community enough to keep the library up-to-date as idiomatic
  | code that other developers (many of whom might have just
  | learned the language in its latest form from a modern resource)
  | can easily read. And therefore that the developer maybe isn't
  | _interested_ in receiving external PRs.
 
  | NanoYohaneTSU wrote:
  | I'm sort of confused on where your comment is coming from. In
  | the modern world (2023 in case your calendar is stuck in the
  | 90s) we have a massive system of APIs and services that get
  | changed all the time internally.
  | 
  | If a library is not constantly updated then there is a high
  | likelihood (99%) that it just won't work. Many issues raised
  | in git are that something changed and now the package is
  | broken. That's reality sis.
 
    | bee_rider wrote:
    | Are you suggesting that all we need to do is use 30 year old
    | languages to free ourselves from this treadmill? That seems
    | like an easy choice!
 
  | RhodesianHunter wrote:
  | That's only true for libraries with zero transitive
  | dependencies.
  | 
  | Otherwise you're almost guaranteed to be pulling in un-patched
  | vulnerabilities.
 
| kageiit wrote:
| For many use cases, cold-blooded software is not viable. We need
| better tools to automate and remove the tedium involved in
| upgrading dependencies or modernizing codebases to protect
| against ever evolving threats and adapt to changes in the
| ecosystem.
 
| tlhunter wrote:
| I love the sentiment of this post. I absolutely hate that my
| recent mobile apps from only a couple years ago now require a
| dozen hours to patch them up and submit updates.
| 
| The author's final point is interesting wherein they refer to
| their own static site generator as being cold-blooded and that it
| runs on Python 2. Python 2 is getting harder to install, which
| will eventually make it a warm-blooded project.
 
  | ryandrake wrote:
  | I have a little hobby project (iOS and macOS) that I don't
  | regularly develop anymore, but I use it quite often as a user,
  | and I like to keep it compiling and running on the latest OSes.
  | It's aggravating (and should be totally unacceptable) that
  | every time I upgrade Xcode, I have a few odds and ends that
  | need to be fixed in order for the project to compile cleanly
  | and work. My recent git history comments are _all_ variations
  | of  "Get project working on latest Xcode".
  | 
  | I could almost understand if these underlying SDK and OS
  | changes had to be made due to security threats, but that's
  | almost never the case. It's just stupid things like deprecating
  | this API and adding that warning by default and "oh, now you
  | need to use this framework instead of that one". Platforms and
  | frameworks need to stop deliberately being moving targets,
  | especially operating systems that are now very stable and
  | reliable.
  | 
  | I should be able to pull a 10 year old project out of the
  | freezer and have it compile cleanly and run just as it ran 10
  | years ago. These OS vendors are trillion dollar companies. I
  | don't want to hear excuses about boo hoo how much engineering
  | effort backward compatibility is.
 
    | dartos wrote:
    | Hardware changes over 10 years.
    | 
    | Macs don't even run on the same CPU architecture or support
    | OpenGL.
    | 
    | Sometimes things just need to change.
 
      | beambot wrote:
      | The worst is when your virtualization environments intended
      | to provide long-term support don't even accommodate the
      | "new" mainline hardware. Most frustrating example:
      | Virtualbox doesn't work on Apple M1 or M2 chipsets.
 
    | angra_mainyu wrote:
    | Apple is notoriously bad when it comes to this.
    | 
    | I used to work on a cross-platform product and Windows was
    | relatively stable across versions, as was Linux.
    | 
    | Macs on the other hand required a lot of branching for each
    | version.
 
| sowbug wrote:
| A link to this article would be an effective curt reply to the
| "is this project dead?" GitHub issues that have been known to
| enrage and discourage cold-blooded project owners.
 
  | bee_rider wrote:
  | I wonder if GitHub is a bad fit for cold blooded projects? It
  | has social media elements, I'd expect lots of extra chatter and
  | "engagement."
 
| coreyp_1 wrote:
| This. This, so very much!
| 
| I built my websites on Drupal 7 and have enjoyed a decade of
| stability. Now, with D7 approaching EOL in 1 year, I'm looking
| for a solution that will last another decade. There's no reason
| for the EOL, either, other than people wanting to force everyone
| to move on to a newer version. It undoubtedly means more business
| for some people, as they will be able to reach out to their
| clients and say, "Your website is about to be a security risk, so
| you have to pay to update it!" Unfortunately, it means more work
| for me to support my personal projects.
| 
| And why? Because someone somewhere has decided that I should move
| on to something newer and more exciting. But I don't want new and
| exciting... I want rock solid!
| 
| I'm on vacation this week. Am I learning a new hot language like
| Rust, Zig, Go, etc.?
| 
| Nope.
| 
| I have no desire to. I don't trust them to be the same in a
| decade, anyway.
| 
| I'm focusing on C. It's far more enjoyable, and it's stable.
 
  | dartos wrote:
  | Enjoyable is subjective. I can't think of anything less
  | enjoyable than hunting for segfaults in C.
  | 
    | I'd call Go pretty rock solid at this point. Modern Go vs
    | decade-old Go isn't very different. Maybe just the package
    | tooling had one major change.
    | 
    | You'd get the same thing in C if your hardware significantly
    | changed over those 10 years too.
 
    | coreyp_1 wrote:
    | Haha... I agree about it being subjective! I find that I
    | enjoy the process as much as the result. It's like bringing
    | order to a chaotic universe. :)
    | 
    | The thing is, I don't have many segfaults in C, and I find C
    | much easier to debug and hunt down issues in than even C++
    | (which I also enjoy). Also, because C uses very little
    | "magic", and I also know exactly what I'm getting with my
    | code, I find it much easier to reason about.
    | 
    | I heard a quote the other day while watching a presentation:
    | "When you're young you want results; when you're old you want
    | control." I think I'm on the old side now.
    | 
    | As for Go, I genuinely don't have anything against it, but I
    | don't see why I need it either. I don't doubt that others
    | have stellar use cases and impressive results with Go, and
    | that's fine, too, but I don't sense any lack which prompts me
    | to investigate further. I would love to learn more about it,
    | but most of what I see online is either over-the-top (and
    | therefore vomit-inducing) fanboyism, or otherwise
    | unspectacular, which makes me ask "why bother?"
 
  | flir wrote:
  | https://backdropcms.org/ ? D7 fork. If you want to stay there.
 
| hiAndrewQuinn wrote:
| Most of the software I write is at least somewhat cold-blooded by
| this definition. My program to find the dictionary forms of
| Finnish words is an _okay_ example:
| 
| https://github.com/hiAndrewQuinn/finstem
| 
| I wrote the initial draft in an afternoon almost a year ago, and
| from then on endeavored to only make changes which I know play
| nicely with my local software ecology. I usually have `fzf`
| installed, so an interactive mode comes as a shell script. I
| usually have `csvkit`, `jq`, and if all else fails `awk`
| installed, so my last major update was to include flags for CSV,
| JSON, and TSV output respectively. Etc, etc.
| 
| The build instructions intentionally eschew anything like Poetry
| and just give you the shell commands I would run on a fresh
| Ubuntu VirtualBox VM. I hand-test it every couple of months in
| this environment. If the need to Dockerize it ever arose I'm sure
| it would be straightforward, in part because the shell commands
| themselves are straightforward.
| 
| I don't call it a great example because the CLI library I use
| could potentially change. Still, I've endeavored to stick to only
| relatively mature offerings.
 
| csdvrx wrote:
| I follow a similar approach but maybe more extreme: whenever
| possible, I use "YESTERDAY'S TECHNOLOGY TOMORROW"
| 
| It's nicely presented on
| http://itre.cis.upenn.edu/~myl/languagelog/archives/000606.h...
| 
| > I want yesterday's technology tomorrow. I want old things that
| have stood the test of time and are designed to last so that I
| will still be able to use them tomorrow. I don't want tomorrow's
| untested and bug-ridden ideas for fancy new junk made available
| today because although they're not ready for prime time the
| company has to hustle them out because it's been six months since
| the last big new product announcement. Call me old-fashioned, but
| I want stuff that works.
| 
| The same thing is true with free software: I prefer to use the
| terminal. In the terminal, I prefer to run bash and vim, not zsh
| and neovim.
| 
| When I write code, I've found C (and perl!) to be preferable,
| because "You can freeze it for a year and then pick it back up
| right where you left off."
| 
| There are rare exceptions, when what's new is so much better than
| the previous solution (ex: Wayland) that it makes sense to move.
| 
| However, that should be rare, and you should be very sure. If you
| think you made the wrong choice, you can always move back to your
| previous choice: after playing with ZFS for a few years, I'm
| moving some volumes back to NTFS.
| 
| Someone mentions how the author's choice (Python 2) is getting
| harder to install. Cold-blooded software works best when done
| with multiplatform standards, so I'd suggest the author do the
| bare minimum of fixes necessary to run with
| https://cosmo.zip/pub/cosmos/bin/python and call it a day.
| 
| With self-contained APEs and the eventual emulator when, say, 20
| years from now we move to RISC-V, you don't have to bother about
| dependencies, updates or other forms of breakage: compile once in
| APE form (statically linked for Windows/Linux/BSD/macOS) and it
| will run forever by piggybacking on the popularity of the once-
| popular platform.
| 
| Wine lets you run Windows 95 binaries about 30 years later: I'd
| bet that Wine + the Windows part of the APE will keep running
| long after the kernel breaks the ABI.
 
| slaymaker1907 wrote:
| Besides what is stated in the article, it is also important to
| have an inherently secure threat model. For example, full
| websites are inherently warm-blooded since you are constantly
| dealing with attackers, spam bots, etc. However, static pages
| like TiddlyWiki are a lot better since you can avoid putting them
| on the web at all and browsers are incredibly stable platforms.
 
| imran-iq wrote:
| Python is a really bad example of cold-blooded software. There
| are constant breaking changes with it (both runtime and tooling).
| So much so that the author still has to use Python 2, which has
| been EOL'd for quite a while.
| 
| A much better example would be something like Go or Java, where
| 10-year-old code still runs fine with their modern tooling. Or an
| even better example, Perl, where 30-year-old code still runs fine
| to this day.
 
  | 082349872349872 wrote:
  | 2 and 3 don't really differ that much; true cold-blooded
  | software doesn't care which it's being run with.
 
  | JohnFen wrote:
  | Agreed. This is one of the reasons why I avoid using Python
  | whenever possible. Python code I write today is unlikely to be
  | functional years from now, and I consider that a pretty huge
  | problem.
 
    | heurist wrote:
    | This really depends on your environment. I've been running
    | legacy Python servers continuously for 4+ years without
    | breaking them or extensively modifying them because I
    | invested in the environment and tooling around it (which I
    | would do for any app I deploy). I can't say I want to bring
    | all of them entirely up to date with dependencies, but
    | they're still perfectly functional. Python is pretty great,
    | honestly. I rarely need to venture into anything else (for
    | the kind of work I do).
 
      | JohnFen wrote:
      | > I've been running legacy Python servers continuously for
      | 4+ years
      | 
      | That seems like a large amount of effort to make up for a
      | large language deficiency. My (heartfelt) kudos to you!
      | 
      | I might have been willing to do the same if I used Python
      | heavily (I don't because there are a number of other things
      | that makes it very much "not for me") -- but it would still
      | represent effort that shouldn't need to be engaged in.
 
  | frizlab wrote:
  | Go is not a good example either. Some time ago we tried
  | compiling some code a few years after it was written, and it
  | did not work. Someone who actually knew the language and
  | tooling tried and said there was a migration to be done and it
  | was complicated. I have not followed the subject up close, but
  | in the end they just abandoned it, IIRC.
 
  | blakesley wrote:
  | Regarding Python: Really? Obviously v2-to-v3 was an absolute
  | fiasco, but since then, it's been great in my personal
  | experience.
  | 
  | Don't get me wrong: Python hasn't overcome its tooling problem,
  | so there's still that barrier. But once your team agrees on a
  | standardized tool set, you should be able to coast A-OK.
 
  | TheNewAndy wrote:
  | Depends if you mean python the interpreter or python the
  | language. e.g. pypy still supports python2 and has "indefinite
  | support" or something along those lines.
  | 
  | Even the cpython2 interpreter is no longer supported by the
  | original authors, but that doesn't stop someone else from
  | supporting it.
 
  | Aeolun wrote:
  | > java where 10 year old code still runs fine with their modern
  | tooling
  | 
  | I don't know about you. But even when I try to run 3 year old
  | Java code with a new SDK it's always broken _somehow_.
 
    | ahoka wrote:
    | Years? After one year, something amongst the hundreds of deps
    | will have a horrible security vulnerability and updating
    | means breaking changes.
 
| iqandjoke wrote:
| How about security?
 
| ganzuul wrote:
| https://en.wikipedia.org/wiki/Unix_philosophy
| 
| Seems related. Tools built like this which still need constant
| updating must have a foundation of sand.
 
| aranchelk wrote:
| In my mind this is a lot more about tooling and platform than
| language, library, architecture, etc.
| 
| I have a project that's quite complicated and built on fast-
| moving tech, but with every element of the build locked down and
| committed in SCM: Dockerfiles, package sets, etc.
| 
| Alternatively, one of my older projects uses very stable slow-
| moving tech. I never took the time to containerize and codify the
| dependencies. It runs as an appliance and is such a mess that
| it's cheaper to buy duplicates of the original machine that it
| ran on and clone the old hard drive rather than do fresh
| installs.
 
| blastbking wrote:
| I had this experience making an iOS game. After a few years of
| making the game, I went back to it, and found that I was unable
| to get it to compile. I guess iOS games are very warm blooded.
| Perhaps if I had stuck with a desktop platform or web it would
| have remained fine? Not entirely sure.
 
  | yellow_lead wrote:
  | Mobile in general is this way. For instance, on Android, if
  | your app isn't targeting a high enough sdk version, Google will
  | remove it after some time. If you have to upgrade your target
  | sdk, you may find many libraries are broken (or not supported),
  | and it also can lead to other cascades of upgrades, like having
  | to upgrade gradle or the NDK if you use it.
 
| petercooper wrote:
| I think Go's backward compatibility promise -
| https://go.dev/blog/compat - would make much Go software 'cold
| blooded' by this definition (so long as you vendor dependencies!)
 
| aeternum wrote:
| What a terrible name for this. Cold-blooded animals are highly
| dependent on their environment, whereas the bodies of warm-
| blooded animals eliminate the dependency on external temperature
| via metabolism.
| 
| In any case, it's unnecessarily ambiguous. Why not simply say
| 'software without external dependencies' and eliminate the
| paragraphs of meandering explanation?
 
| kugelblitz wrote:
| I've been maintaining my own side project. It started 12-13 years
| ago, with vanilla php, later rewritten with Laravel, later
| rewritten again with Symfony in 2017-ish. Since then I've had
| phases from 6-18 months where I had a total of 2-3 tiny commits
| (I was working full time as a freelancer, so I didn't have energy
| to work on my side project). But then when I had time, I would
| focus on it, add features, upgrade and just experiment and learn.
| 
| This was super valuable to me to learn how to maintain projects
| long-term: Update dependencies, remove stuff you don't need,
| check for security updates, find chances to simplify (e.g. from
| Vagrant to Docker... or from Vue + Axios + Webpack + other stuff
| to Htmx). And what to avoid... for me it was to avoid freshly
| developed dependencies, microservices, complexified
| infrastructure such as Kubernetes.
| 
| And now I just developed a bunch of features, upgraded to PHP 8.2
| and Symfony 7 (released a month ago), integrated some ChatGPT-
| based features and can hopefully relax for 1-3 years if I wanted
| to.
| 
| In the last 4-5 years the project has made about the same revenue
| as an average freelance year's revenue, so it's not some dormant
| unknown side project.
 
  | Aeolun wrote:
  | I think PHP, as horrible as it feels to go back, is one example
  | of something that's truly backwards compatible even to its own
  | detriment.
  | 
  | Haven't worked with it for years, went back to find that the
  | horrible image manipulation functions are still the same mess
  | that I left behind 8 years ago.
 
| oooyay wrote:
| This got me thinking if any of my side projects or work projects
| that are in maintenance mode could qualify as "cold blooded".
| Conceptually, they can - I have many projects written in Go,
| Typescript, and Python where I could cache my dependencies (or at
| least the SHAs) and do what this is implying. The problem is that
| it stops being useful beyond proving the concept. In reality, all
| my projects have a slow churn that usually has to do with
| vulnerability updates. Maybe more aptly put, "Can I take this Go
| repository off the shelf, rebuild the binary, and let it run?";
| the answer is of course - assuming HTML and web standards haven't
| changed too much. The problem is that then some old vulnerability
| could be immediately used against it. The assumption I also made,
| that HTML and web standards haven't changed too much, will almost
| assuredly be falsey. They may not have changed enough to be
| breaking, but they'll have certainly changed to some degree; the
| same can be said for anyone that's developed desktop applications
| for any OS. The one constant is change. Either side of that coin
| seems to be a losing proposition.
 
| __natty__ wrote:
| In the Node & JavaScript ecosystem, there is the web framework
| Express. The current major version 4.x.x branch is over 10 years
| old [1]. And yet it powers so many apps in the ecosystem (over
| 17M downloads every week [2]). It lacks some features and is not
| the most performant [3]. But my coworkers and I like
| it because it allows for quick, stable development and long-term
| planning without worrying about drastic API changes and lack of
| security patches for older versions. Even better stability is
| provided with Go where we can run over 10-year-old programs
| thanks to a mix of wide stdlib and the promise of compatibility.
| [4]
| 
| [1] https://www.npmjs.com/package/express?activeTab=versions
| 
| [2] https://www.npmjs.com/package/express
| 
| [3] https://fastify.dev/benchmarks/
| 
| [4] https://go.dev/doc/go1compat
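| 
| A minimal sketch of the kind of app that has stayed source-
| compatible across the whole 4.x line (the route and port here are
| arbitrary):
| 
|     // A classic Express 4 app; the framework API used here is
|     // unchanged across the 4.x releases.
|     const express = require('express');
|     const app = express();
| 
|     app.get('/', (req, res) => {
|       res.send('Hello from a cold-blooded Express app');
|     });
| 
|     app.listen(3000);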
 
| ChrisMarshallNY wrote:
| I wrote an SDK, in 1994-95, that was still in use, when I left
| the company, in 2017.
| 
| It was a device control interface layer, and was written in
| vanilla ANSI C. Back when I wrote it, there wasn't a common
| linker, so the only way to have a binary interface, was to use
| simple C.
| 
| I have written stuff in PHP (5), that still works great, in PHP
| 8.2. Some of that stuff is actually fairly ambitious.
| 
| But it's boring, and has a low buzzword index.
 
| d_burfoot wrote:
| This essay showcased an excellent writing technique: at the
| outset, I had no idea what the title meant. But at the
| conclusion, it made perfect sense.
 
| js8 wrote:
| I work on IBM mainframes (z/OS). Nothing else I know comes close
| to IBM in maintaining backwards compatibility. Microsoft
| (Windows) is 2nd, I think. The Linux (kernel) ABI takes 3rd
| place, but that's only a small portion of the Linux ecosystem.
| 
| Almost everything else, it's just churn. In OSS this is common, I
| guess nobody wants to spend time on backward compatibility as a
| hobby. From an economic perspective, it looks like a prisoner's
| dilemma - everybody externalizes the cost of maintaining
| compatibility onto others, collectively creating more useless
| work for everybody.
 
___________________________________________________________________
(page generated 2023-12-28 23:00 UTC)