Declining software quality
I was chatting recently with somebody who shares my observations on the apparent decline in software quality, and figured I may as well capture the broad strokes of my thoughts here.
Certainly back in the 1980s, and maybe also the 1990s, software often impressed me. I could easily figure out how to use it and how to get it to do what I wanted, and it was a genuine surprise to discover a bug. These days I often find software difficult to get on with: it tries to force me into whatever workflows somebody decided users ought to want, and it is positively bug-ridden.
One thing I am curious about is to what extent this trend has been quantitatively corroborated. Organizations like NASA put research money into assessing software quality; I wonder what they have found.
Another is: what's the cause? Is it that so many people are now entering the software industry that their typical ability level is far lower? Is it the false economy of building one's software atop a whole stack of complex third-party libraries that are all too often both buggy and barely maintained? I suspect those two are connected: it may take a certain skill level to recognize a steaming pile of festering liability, just as it takes a decent programmer to recognize another in recruitment interviews. Or maybe developers reach for the third-party cruft because they don't see how they could easily build the part they need themselves, and wouldn't recognize clean elegance even if it lived next door to them.
Or maybe it's a social change in what's acceptable: professional pride giving way to commercial expediency, a market that rewards feature lists over reliability, and so on. Perhaps I am a poor fit for the modern world in caring more about software being clear and correct than about it being featureful.
I see so much software that feels as if it should never have gotten through even permissive quality assurance. For instance, I perform some input action, clicking a button or whatever, and there is no immediate visual change to confirm that my act was noticed; the action then goes on to do something both surprising and difficult to reverse. Even from companies that are supposed to be good at user interfaces, such as Apple, I see horrors: visually identical indicators with divergent meanings, and actions with no visible effect on the corresponding status display. This is such basic stuff, but people no longer get it right, and it makes me want to find some other technical industry in which quality still matters.
I am not against learning new things; I just find few of them worthwhile.
New so often does not mean better. This has been true for a while: witness Java's failure to learn from Modula-3 in so many ways. There are some good developments: for example, if I wanted PostScript now, I expect I would be better off with Cairo. And a big development for me in more recent years was Haskell: its casually convenient capacity for flexible abstraction means that when I need some piece of framework atop which to build my application, I can often easily roll it myself, and it generally turns out to be simple, correct and reusable (the sketch at the end of this post gives the flavour). Haskell does also have some high-quality third-party libraries.

But it is not good for my employability to wall myself off in a niche, especially when living somewhere with a lower density of software jobs than I am used to. Things are not all bad: for instance, I would be happy to dust off my JavaScript, given that browser support seems to become less of an awkward issue as time goes on, and Java's new java.time library is an improvement over the old Date and Calendar horror. But the new learning I do that feels pleasant tends to be of old technologies, like Erlang.
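To give the flavour of what I mean by rolling such a piece myself, here is a minimal sketch of my own, invented for this post rather than taken from any real codebase: a reusable retry combinator. The name retrying and the doubling backoff policy are just illustrative choices.

import Control.Concurrent (threadDelay)
import Control.Exception (SomeException, throwIO, try)

-- Run an IO action, retrying on any exception up to the given number
-- of attempts, doubling the pause (in microseconds) between tries;
-- if every attempt fails, rethrow the final exception.
retrying :: Int -> Int -> IO a -> IO a
retrying attempts pause act = do
  result <- try act
  case result of
    Right value -> pure value
    Left err
      | attempts <= 1 -> throwIO (err :: SomeException)
      | otherwise -> do
          threadDelay pause
          retrying (attempts - 1) (pause * 2) act

Something like retrying 5 200000 (readFile "settings.conf") then behaves sensibly with no dependency beyond the base library, and the whole thing fits in a dozen lines, which is rather the point.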