Declining software quality
Jan. 24th, 2017, 06:43 am

I was chatting recently with somebody who shares my observations on the apparent decline of the quality of software and figured that I may as well capture the broad strokes of my thoughts here.
Certainly back in the 1980s, maybe also the 1990s, software often impressed me. I could easily figure out how to use it, how to get it to do what I wanted, and it was genuinely a surprise to discover a bug. These days I often find software difficult to get on with, trying to force me into specific workflows based on what somebody else thinks users ought to want to do, and positively bug-ridden.
One thing I am curious about is to what extent this trend is quantitatively corroborated. Organizations like NASA put research money into assessing software quality. I wonder what they have found.
Another is, what's the cause? Is it that there are so many people now entering the software industry that their typical ability level is much lower? Is it the false economy of building one's software atop a whole stack of complex third-party libraries that are all too often both buggy and barely maintained? I suspect that those two are connected: it may take a certain skill level to recognize a steaming pile of festering liability, just as it takes a decent programmer to recognize another in recruitment interviews. Or maybe they have to use the third-party cruft because they don't see how they could easily build the parts they need themselves, and wouldn't recognize clean elegance even if it lived next door to them.
Or maybe it's a social change in what's acceptable: professional pride giving way to commercial expediency, a market failure that rewards feature lists over reliability, etc. Perhaps I am a poor fit for the modern world in caring more about software being clear and correct than about it being featureful.