Conclusions
As you, dear reader, can certainly tell, establishing solid foundations lets us build future-proof software from scratch in relatively little time - without losing sight of any of the initially defined design goals. On the contrary: building upon these solid foundations allows us to design for maintainability and reproducibility, while GNU Guix's toolbox enables us to build for verifiability by keeping track of both configuration and dependencies.
Keeping an Overview
Maintainability
Keeping track of specifics like package build parameters, (recursive) dependencies, system configuration or system service configuration quickly becomes tedious in the legacy systems still in use today. Historically grown operating systems often fail to provide clear upgrade paths, have no means of system replication (apart from binary duplication of root file systems) and tend to be laborious to inspect and debug - leaving administrators to fall back on questionable techniques: setting up new machines instead of upgrading existing ones, trial and error with copied configuration files, relying on full-disk backups (for every machine in use), and so on.
GNU Guix - as a manager of package and operating-system definitions and build artifacts, and as a unifier of software configuration - enables drastic steps towards a future where single files or cleverly crafted modules yield versatile yet lean software products.
Verifiability
As demonstrated in this little project, programmers need no more than one language to craft, inspect, debug or verify package, service or operating-system definitions - or, if need be, (almost) the entirety of GNU Guix's code base. This drastically closes - or at least narrows - the gap between contributors and users, enabling novices to contribute to the project and intermediate hackers to chime in on broader or deeper design decisions.
Our tech stacks are complex enough as they are without us throwing more and more languages, tools and other unnecessary dependencies at them.
Tackling Supply-Chains
Having an idea of what software is needed to build and run your software is essential, especially in a world where dependencies on third-party products are one of the easier attack vectors for malicious parties.
At the very least, creators of software should be able to figure out what dependencies exist and what specific versions they depend on.
With GNU Guix this is almost laughably easy:
guix graph verteiler
prints a dependency graph of the verteiler web application (as defined in the teil channel) in dot notation.
For a full dependency graph of the whole system in action you can use:
guix graph -t derivation $(guix system build -d path/to/system.scm)
If the teil channel is not readily available in your profile, you will need to replace the guix in these commands with guix time-machine -C /path/to/channels.scm -- (referring to this channels.scm file).
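Such a channels file is ordinarily a small Scheme expression extending Guix's default channels. A minimal sketch might look like the following - note that the repository URL here is a hypothetical placeholder, not the channel's real location:

```scheme
;; channels.scm -- a sketch; the URL below is a placeholder,
;; not the actual location of the teil channel.
(cons (channel
        (name 'teil)
        (url "https://example.org/teil.git"))  ; replace with the real repository
      %default-channels)
```

Passing such a file to guix time-machine -C makes the channel (and thus verteiler) available to the spawned Guix without touching your profile.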
For practical reasons I will not share the graph of the whole system - the dot file alone is 13 MB in size, and graphviz, trying to fit all of the information into a single image on a single core, seems unfit for the job.
The PNG rendering of the graph of the verteiler application itself is already rather large (21MB, 32767x4430 pixels).
Reproducibility
Once we reach a state where our software runs just the way we want to persist it, pinning channels is as simple as running
guix describe --format=channels
or, if we make use of a dedicated channels file:
guix time-machine -C channels.scm -- describe --format=channels
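The output is itself a valid channels file: a Scheme list of channel records, each pinned to an exact commit. Its shape is roughly as follows - the commit hashes and the teil URL below are placeholders, not real values:

```scheme
;; Sketch of the output of `guix describe --format=channels'.
;; Hashes and the teil URL are placeholders.
(list (channel
        (name 'guix)
        (url "https://git.savannah.gnu.org/git/guix.git")
        (branch "master")
        (commit "0000000000000000000000000000000000000000"))
      (channel
        (name 'teil)
        (url "https://example.org/teil.git")
        (commit "1111111111111111111111111111111111111111")))
```

Saved as, say, pinned-channels.scm, this file can later be fed back via guix time-machine -C pinned-channels.scm -- to rebuild against exactly these revisions.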
Sharing this output alone and using it with guix time-machine allows us to reproduce, bit for bit, all the artifacts related to this project. There is no need for extra files, big (or gigantic) binaries or other magic.
Bottom lines
While relying on legacy, off-the-shelf solutions might bring short-term advantages to developers (e.g. a shorter time to a minimum viable product thanks to well-established technologies), this brand of extreme programming leaves a trail of almost unmanageable, unverifiable and incredibly complex software stacks behind us - a situation in which cleaning up is usually impossible to sell to customers (who wants to pay for something that already works?).
Crafting clean solutions not only enables clean workflows, tidy repositories and maintainable, verifiable software; it also allows us to strive for resource-preserving designs that result in both smaller and faster software. In a world where sustainability (finally) seems like something people should strive for, a reasonable use of resources should be essential and at the very foundation of each venture. This of course entails hardware acquisition, but it absolutely needs to be reflected in software as well - Wirth's law is a monument to regression and has to be overcome for a sustainable future.