Embrace the Bloat
Posted by Michał ‘mina86’ Nazarewicz on 16th of May 2021
‘I’m using slock as my screen locker,’ a wise man once said. He had a beard, so surely he was wise.
‘Oh?’ His colleague raised a brow, intrigued. ‘Did they fix the PAM bug?’ he prodded. Nothing but a confused stare came in reply. ‘slock crashes on systems using PAM,’ he explained and, to demonstrate the point, approached a nearby machine and pressed the Return key.
Screens, blanked by a locker just a few minutes prior, came back to life, unlocked without the need to enter the password.
No, I wasn’t the one in the story, but I might just as well have been. When I witnessed this conversation I was using slock too, preferring it over a ‘bloated’ xscreensaver.† In theory, reducing ‘bloat’ is a sound idea. As Steve McConnell reports in Code Complete, industry code averages 1–25 bugs per thousand lines. It follows that using software with fewer lines of code reduces the number of vulnerabilities one is exposed to.
Alas, reducing ‘bloat’ is rarely a rational pursuit. Hardly anyone analyses the security implications or resource usage; rather, people are motivated by a dogma proclaiming that ‘bloat’ is evil and that software without ‘bloat’ is divine.‡
But that dogma is not always correct. slock has 395 lines of code while xscreensaver has nearly half a million. And yet, time and time again, Jamie Zawinski has been proven right. Similarly, one could argue that LibreSSL was a premature fork and that the free software community would have been better served by focusing all resources on OpenSSL rather than splitting some of the effort into a library whose big selling point was the removal of features.
A common argument brought up in discussions about ‘bloat’ is the Unix philosophy of writing programs which do one thing and do it well. For example, does GNU cat really need a --show-all switch? It could be replaced with a sed script, and removing its handling would improve cat, right? Except what’s often lost in that reasoning is that now everyone has to write and maintain a non-trivial sed script rather than rely on a shared tool used by millions of people and maintained by a well-established project.
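To make that concrete, here is roughly what such a replacement looks like. This is only a sketch: sed’s `l` command is the closest standard approximation of `--show-all`, and even then the two use different escape conventions, so the outputs don’t match exactly.

```shell
# GNU cat's --show-all (-A) marks ends of lines with '$' and shows
# tabs and other non-printing characters in ^-notation:
printf 'a\tb\n' | cat --show-all     # prints: a^Ib$

# The nearest sed equivalent is the 'l' command, which also marks
# line ends with '$' but escapes a tab as \t rather than ^I:
printf 'a\tb\n' | sed -n l           # prints: a\tb$
```

And that is the easy case; reproducing cat’s exact output would require a considerably longer script, which every user would then have to write and maintain on their own.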
Am I saying that we should push as much code as possible into every application? Choose tools with the biggest binary size? No, of course not. Programs with a well-defined scope, devoid of unnecessary features, have their advantages, but comparing the number of lines of code is never a good metric by itself. It’s better to try to understand where those additional lines come from. Even if two programs offer the same set of features, one might be larger because it’s better optimised or uses a more secure design.
The size of the executable or the number of lines of code is a poor proxy for other, actually significant properties of a system. If you care about security, look at the design of a piece of software and how its project has historically reacted to vulnerabilities. If you want maximum speed, first make sure that executable size is actually a problem (especially since a smaller binary may mean fewer optimisations). If you are after minimising storage, first make sure that you cannot spare those few cents for another gigabyte. Make sure you’re optimising what you actually care about, not an irrelevant proxy metric.
† It was just pure luck that I wasn’t affected by the bug. My account password was long and complicated. While good for security, it got annoying fast when I had to type it each time I returned with a fresh mug of tea. Because of that, I was running a modified version of slock which verified the entered password against a hard-coded hash rather than my system credentials. As an aside, this had the additional benefit of preventing my account password from being leaked if I ever tried to unlock a screen after forgetting to lock it beforehand.
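The idea from the footnote can be sketched as follows. This is only an illustration of the technique, not the actual patch: the real modification was to slock’s C source, and the hashing scheme, function name, and example password here are all made up for demonstration.

```shell
#!/bin/sh
# Sketch: verify an entered string against a hard-coded digest
# instead of asking PAM or /etc/shadow. Hypothetical example;
# the stored value is the SHA-256 of the string 'password'.
STORED_HASH='5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8'

check_password() {
    entered_hash=$(printf '%s' "$1" | sha256sum | awk '{print $1}')
    [ "$entered_hash" = "$STORED_HASH" ]
}

if check_password 'password'; then
    echo unlocked
fi
```

Note the side benefit mentioned in the footnote: since only a hash of a throwaway unlock phrase is compared, typing the phrase at the wrong prompt never exposes the real account password.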
‡ Worse still, some people seem to be motivated by a ‘GNU is bad’ attitude. It’s not uncommon on Internet forums to see someone trying to argue GNU tools are bad by bringing up their number of lines of code or the size of their binaries.